Mar 19 20:05:21 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 20:05:21 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 20:05:21 crc restorecon[4678]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc 
restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc 
restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 
crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc 
restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 20:05:21 crc restorecon[4678]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:21 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 
crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc 
restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc 
restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 20:05:22 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 20:05:22 crc kubenswrapper[4799]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 20:05:22 crc kubenswrapper[4799]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 20:05:22 crc kubenswrapper[4799]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 20:05:22 crc kubenswrapper[4799]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 20:05:22 crc kubenswrapper[4799]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 20:05:22 crc kubenswrapper[4799]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.836436 4799 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844653 4799 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844696 4799 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844708 4799 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844724 4799 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844739 4799 feature_gate.go:330] unrecognized feature gate: Example Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844753 4799 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844764 4799 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844774 4799 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844784 4799 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844792 4799 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844801 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844809 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844816 4799 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844824 4799 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844831 4799 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844839 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844846 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844854 4799 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 
20:05:22.844861 4799 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844869 4799 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844880 4799 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844889 4799 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844899 4799 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844908 4799 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844930 4799 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844939 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844947 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844955 4799 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844962 4799 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844970 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844977 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844986 4799 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.844995 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845004 4799 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845012 4799 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845020 4799 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845028 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845035 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845043 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845052 4799 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845060 4799 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845069 4799 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845079 4799 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845089 4799 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845097 4799 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845105 4799 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845113 4799 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845121 4799 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845129 4799 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845136 4799 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845143 4799 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845153 4799 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845160 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845168 4799 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845176 4799 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845183 4799 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845191 4799 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845199 4799 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845207 4799 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845215 4799 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845222 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845229 4799 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845237 4799 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845244 4799 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845252 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845259 4799 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845267 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845275 4799 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845282 4799 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845290 4799 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.845297 4799 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846339 4799 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846366 4799 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846413 4799 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846425 4799 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846437 4799 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846446 4799 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846458 4799 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846470 4799 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846479 4799 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846488 4799 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846498 4799 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846507 4799 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846518 4799 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846527 4799 flags.go:64] FLAG: --cgroup-root=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846536 4799 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846546 4799 flags.go:64] FLAG: --client-ca-file=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846555 4799 flags.go:64] FLAG: --cloud-config=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846564 4799 flags.go:64] FLAG: --cloud-provider=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846573 4799 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846584 4799 flags.go:64] FLAG: --cluster-domain=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846592 4799 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846602 4799 flags.go:64] FLAG: --config-dir=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846611 4799 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846621 4799 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846632 4799 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846642 4799 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846651 4799 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846662 4799 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846672 4799 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846680 4799 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846689 4799 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846708 4799 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846716 4799 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846728 4799 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846737 4799 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846745 4799 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846754 4799 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846764 4799 flags.go:64] FLAG: --enable-server="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846774 4799 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846785 4799 flags.go:64] FLAG: --event-burst="100"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846794 4799 flags.go:64] FLAG: --event-qps="50"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846803 4799 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846812 4799 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846821 4799 flags.go:64] FLAG: --eviction-hard=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846832 4799 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846841 4799 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846850 4799 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846859 4799 flags.go:64] FLAG: --eviction-soft=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846868 4799 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846877 4799 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846886 4799 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846895 4799 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846904 4799 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846913 4799 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846922 4799 flags.go:64] FLAG: --feature-gates=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846933 4799 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846941 4799 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846950 4799 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846959 4799 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846969 4799 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846978 4799 flags.go:64] FLAG: --help="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846986 4799 flags.go:64] FLAG: --hostname-override=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.846995 4799 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847007 4799 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847016 4799 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847025 4799 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847034 4799 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847043 4799 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847051 4799 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847060 4799 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847069 4799 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847079 4799 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847088 4799 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847098 4799 flags.go:64] FLAG: --kube-reserved=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847106 4799 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847117 4799 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847126 4799 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847135 4799 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847143 4799 flags.go:64] FLAG: --lock-file=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847152 4799 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847161 4799 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847170 4799 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847190 4799 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847199 4799 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847208 4799 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847217 4799 flags.go:64] FLAG: --logging-format="text"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847226 4799 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847236 4799 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847244 4799 flags.go:64] FLAG: --manifest-url=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847253 4799 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847265 4799 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847274 4799 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847285 4799 flags.go:64] FLAG: --max-pods="110"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847294 4799 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847302 4799 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847314 4799 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847323 4799 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847332 4799 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847342 4799 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847351 4799 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847371 4799 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847380 4799 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847415 4799 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847425 4799 flags.go:64] FLAG: --pod-cidr=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847433 4799 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847447 4799 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847456 4799 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847465 4799 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847474 4799 flags.go:64] FLAG: --port="10250"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847507 4799 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847516 4799 flags.go:64] FLAG: --provider-id=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847525 4799 flags.go:64] FLAG: --qos-reserved=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847534 4799 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847543 4799 flags.go:64] FLAG: --register-node="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847552 4799 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847561 4799 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847576 4799 flags.go:64] FLAG: --registry-burst="10"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847585 4799 flags.go:64] FLAG: --registry-qps="5"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847594 4799 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847603 4799 flags.go:64] FLAG: --reserved-memory=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847614 4799 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847623 4799 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847633 4799 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847641 4799 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847650 4799 flags.go:64] FLAG: --runonce="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847660 4799 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847669 4799 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847679 4799 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847691 4799 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847700 4799 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847709 4799 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847718 4799 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847728 4799 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847737 4799 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847745 4799 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847754 4799 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847763 4799 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847773 4799 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847782 4799 flags.go:64] FLAG: --system-cgroups=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847790 4799 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847804 4799 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847813 4799 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847821 4799 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847833 4799 flags.go:64] FLAG: --tls-min-version=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847841 4799 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847851 4799 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847859 4799 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847869 4799 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847878 4799 flags.go:64] FLAG: --v="2"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847889 4799 flags.go:64] FLAG: --version="false"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847900 4799 flags.go:64] FLAG: --vmodule=""
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847911 4799 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.847920 4799 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848112 4799 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848124 4799 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848132 4799 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848140 4799 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848151 4799 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848160 4799 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848168 4799 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848179 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848187 4799 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848194 4799 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848203 4799 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848210 4799 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848218 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848225 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848233 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848241 4799 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848249 4799 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848256 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848264 4799 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848274 4799 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848284 4799 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848294 4799 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848302 4799 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848310 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848318 4799 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848326 4799 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848335 4799 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848344 4799 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848355 4799 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848363 4799 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848372 4799 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848380 4799 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848414 4799 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848426 4799 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848435 4799 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848444 4799 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848452 4799 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848460 4799 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848470 4799 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848480 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848489 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848497 4799 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848507 4799 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848515 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848522 4799 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848530 4799 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848537 4799 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848545 4799 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848554 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848561 4799 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848569 4799 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848576 4799 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848584 4799 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848591 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848599 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848606 4799 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848614 4799 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848624 4799 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848633 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848641 4799 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848650 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848658 4799 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848666 4799 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848674 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848684 4799 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848692 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848702 4799 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848710 4799 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848719 4799 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848727 4799 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.848734 4799 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.848760 4799 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.862737 4799 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.862833 4799 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.862975 4799 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863001 4799 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863010 4799 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863024 4799 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863039 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863049 4799 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863058 4799 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863066 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863074 4799 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863082 4799 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863090 4799 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863097 4799 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863108 4799 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863122 4799 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863131 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863139 4799 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863148 4799 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863156 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863164 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863172 4799 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863181 4799 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863188 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863196 4799 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863204 4799 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863214 4799 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863224 4799 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863232 4799 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863240 4799 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863248 4799 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863256 4799 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863266 4799 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863275 4799 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863284 4799 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863294 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863306 4799 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863316 4799 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863324 4799 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863332 4799 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863340 4799 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863348 4799 feature_gate.go:330] unrecognized 
feature gate: MetricsCollectionProfiles Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863356 4799 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863364 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863372 4799 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863380 4799 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863413 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863421 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863430 4799 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863437 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863446 4799 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863454 4799 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863496 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863504 4799 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863512 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863520 4799 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863528 4799 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863539 4799 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863550 4799 feature_gate.go:330] unrecognized feature gate: Example Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863558 4799 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863566 4799 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863575 4799 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863583 4799 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863591 4799 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863599 4799 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863607 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863614 4799 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863622 4799 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863630 4799 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863637 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 20:05:22 crc 
kubenswrapper[4799]: W0319 20:05:22.863645 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863652 4799 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863662 4799 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.863677 4799 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863927 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863941 4799 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863951 4799 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863960 4799 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863968 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863976 4799 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863984 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.863993 4799 feature_gate.go:330] unrecognized 
feature gate: RouteAdvertisements Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864001 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864011 4799 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864019 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864028 4799 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864036 4799 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864044 4799 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864052 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864060 4799 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864068 4799 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864080 4799 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864093 4799 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864102 4799 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864111 4799 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864119 4799 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864128 4799 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864136 4799 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864147 4799 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864157 4799 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864166 4799 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864175 4799 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864184 4799 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864193 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864202 4799 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864210 4799 feature_gate.go:330] unrecognized feature gate: Example Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864218 4799 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864225 4799 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864234 4799 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864242 4799 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864253 4799 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864262 4799 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864270 4799 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864278 4799 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864286 4799 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864293 4799 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864303 4799 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864312 4799 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864320 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864328 4799 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864337 4799 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864345 4799 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864356 4799 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864366 4799 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864375 4799 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 20:05:22 crc 
kubenswrapper[4799]: W0319 20:05:22.864409 4799 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864417 4799 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864426 4799 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864434 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864442 4799 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864450 4799 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864458 4799 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864465 4799 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864473 4799 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864481 4799 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864489 4799 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864497 4799 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864505 4799 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864513 4799 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864520 4799 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864528 4799 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864536 4799 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864544 4799 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864552 4799 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 20:05:22 crc kubenswrapper[4799]: W0319 20:05:22.864570 4799 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.864583 4799 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.865009 4799 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 20:05:22 crc kubenswrapper[4799]: E0319 20:05:22.876042 4799 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.882280 4799 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.882516 4799 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.884829 4799 server.go:997] "Starting client certificate rotation" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.884898 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.885758 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.911160 4799 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 20:05:22 crc kubenswrapper[4799]: E0319 20:05:22.914239 4799 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.915823 4799 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.947017 4799 log.go:25] "Validated CRI v1 runtime API" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.985684 4799 log.go:25] "Validated CRI v1 image API" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.987934 4799 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 20:05:22 crc kubenswrapper[4799]: I0319 20:05:22.993672 4799 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-20-00-25-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 19 20:05:22 crc 
kubenswrapper[4799]: I0319 20:05:22.993694 4799 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.009907 4799 manager.go:217] Machine: {Timestamp:2026-03-19 20:05:23.005246833 +0000 UTC m=+0.611199925 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:26a95011-54a2-4e43-8b10-a5eb1bfc734c BootID:e11fefaa-2228-43f6-a9fc-d0587c913216 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 
Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b8:6f:1e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b8:6f:1e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:50:a1:71 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:87:21:31 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4a:28:25 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e1:31:54 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:9c:98:d8:15:4d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:c6:f0:93:40:c9:da Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.010124 4799 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.010260 4799 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.013980 4799 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.014315 4799 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.014413 4799 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.014790 4799 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.014809 4799 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.015490 4799 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.015533 4799 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.015777 4799 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.015935 4799 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.020728 4799 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.020763 4799 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.020840 4799 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.020861 4799 kubelet.go:324] "Adding apiserver pod source"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.021327 4799 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.027727 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused
Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.027803 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused
Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.027899 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError"
Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.027965 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.028131 4799 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.029614 4799 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.032446 4799 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034080 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034112 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034122 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034131 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034147 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034156 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034193 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034236 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034246 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034257 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034283 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.034293 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.036619 4799 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.041631 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.044070 4799 server.go:1280] "Started kubelet"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.046641 4799 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.046700 4799 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 20:05:23 crc systemd[1]: Started Kubernetes Kubelet.
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.047461 4799 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.048086 4799 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.048157 4799 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.048333 4799 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.048347 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.048417 4799 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.048365 4799 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.050106 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused
Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.050217 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.050480 4799 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.050541 4799 factory.go:55] Registering systemd factory
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.050557 4799 factory.go:221] Registration of the systemd container factory successfully
Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.050748 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="200ms"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.054971 4799 factory.go:153] Registering CRI-O factory
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.055018 4799 factory.go:221] Registration of the crio container factory successfully
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.055073 4799 factory.go:103] Registering Raw factory
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.055106 4799 manager.go:1196] Started watching for new ooms in manager
Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.054503 4799 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e56c8a815d8bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,LastTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.056138 4799 server.go:460] "Adding debug handlers to kubelet server"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.056933 4799 manager.go:319] Starting recovery of all containers
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063572 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063663 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063687 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063705 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063756 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063784 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063807 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063828 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063852 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063871 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063891 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063909 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063926 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063953 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063973 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.063992 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064016 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064036 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064053 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064073 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064092 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064110 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064125 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064140 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064156 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.064172 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.071920 4799 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072055 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072133 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072168 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072214 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072245 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072290 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072327 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072356 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072439 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072473 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072520 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072552 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072582 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072623 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072655 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072701 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072733 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072764 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072804 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072833 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072874 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072910 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072942 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.072982 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073045 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073090 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073136 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073185 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073231 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073265 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073308 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073340 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073380 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073446 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073486 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073518 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073554 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073595 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073625 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073663 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073694 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073725 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073939 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073967 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.073996 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074085 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074119 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074165 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074195 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074223 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074262 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074293 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074419 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074460 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074490 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074529 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074562 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074602 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074631 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074659 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074698 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074727 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074766 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074797 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074829 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074869 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074896 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.074934 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075275 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075324 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075345 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075378 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075426 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075446 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075475 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075499 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075524 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075544 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.075599 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077332 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077369 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" 
seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077452 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077479 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077508 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077559 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077582 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077612 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 
20:05:23.077645 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077667 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077686 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077715 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.077734 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078114 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078136 4799 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078160 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078174 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078193 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078288 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078316 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078344 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078363 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078414 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078443 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078462 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078488 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078507 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.078527 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080065 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080108 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080135 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080159 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080178 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080202 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080224 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080249 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080271 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080291 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080732 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080807 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080845 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080861 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080883 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080899 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080932 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" 
seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080947 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080962 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080978 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.080991 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081008 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081021 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 19 20:05:23 crc 
kubenswrapper[4799]: I0319 20:05:23.081038 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081057 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081072 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081093 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081114 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081133 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081153 4799 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081166 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.081191 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084297 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084315 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084329 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084342 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084356 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084369 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084402 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084422 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084435 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084451 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084465 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084478 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084491 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084503 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084516 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084529 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084541 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084553 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084566 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084578 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084590 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084602 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" 
seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084615 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084627 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084640 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084653 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084665 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084677 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 20:05:23 crc 
kubenswrapper[4799]: I0319 20:05:23.084689 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084701 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084713 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084725 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084737 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084749 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084762 4799 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084774 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084786 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084797 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084809 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084822 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084835 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084847 4799 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084860 4799 reconstruct.go:97] "Volume reconstruction finished" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.084871 4799 reconciler.go:26] "Reconciler: start to sync state" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.097068 4799 manager.go:324] Recovery completed Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.110312 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.112740 4799 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.112942 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.112995 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.113004 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.113984 4799 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.113998 4799 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.114016 4799 state_mem.go:36] "Initialized new in-memory state store" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.114723 4799 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.114784 4799 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.114816 4799 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.114956 4799 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.115628 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.115698 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.118977 4799 policy_none.go:49] "None policy: Start" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.120579 4799 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.120611 4799 state_mem.go:35] "Initializing new in-memory state store" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.149020 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.189924 4799 manager.go:334] "Starting Device Plugin manager" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.189983 4799 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.189999 4799 server.go:79] "Starting device plugin registration server" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.190610 4799 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.190627 4799 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.190839 4799 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.191036 4799 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.192865 4799 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.203371 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.215559 4799 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.215638 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.217482 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.217505 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.217514 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.217600 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.217863 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.217930 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.218313 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.218348 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.218360 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.218554 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.218686 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.218736 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219031 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219076 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219089 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219486 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219516 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219528 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219576 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219595 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219609 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219724 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219849 
4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.219890 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220308 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220344 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220355 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220466 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220554 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220578 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220590 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220595 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.220624 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221064 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221091 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221103 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221221 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221243 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221506 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221532 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221545 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221837 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221864 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.221876 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.251558 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="400ms" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286368 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286420 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286440 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286456 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286724 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286777 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286800 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.286993 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.287249 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.287296 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.287327 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.287356 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.287424 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.287572 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 
crc kubenswrapper[4799]: I0319 20:05:23.287624 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.291363 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.292539 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.292584 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.292601 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.292638 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.293242 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.388998 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389049 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389073 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389098 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389112 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389128 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389146 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 
20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389164 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389181 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389196 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389212 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389225 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389290 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389326 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389336 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389229 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389346 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389377 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389394 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389339 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389361 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389447 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389467 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389511 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389504 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389523 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389566 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389526 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389665 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.389809 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.494106 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.495586 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.495641 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.495658 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.495699 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.496264 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.558341 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.585342 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.599540 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.627043 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.627253 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5681226fb15f728fce3987a089185c965a53bd75bf12ab17f26240f63976e074 WatchSource:0}: Error finding container 5681226fb15f728fce3987a089185c965a53bd75bf12ab17f26240f63976e074: Status 404 returned error can't find the container with id 5681226fb15f728fce3987a089185c965a53bd75bf12ab17f26240f63976e074 Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.629736 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-bd891e77fca38d23f2a0161fa7b06bc37d38b701420fb66657440ef513df329d WatchSource:0}: Error finding container bd891e77fca38d23f2a0161fa7b06bc37d38b701420fb66657440ef513df329d: Status 404 returned error can't find the container with id bd891e77fca38d23f2a0161fa7b06bc37d38b701420fb66657440ef513df329d Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.634281 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0c6460e86317bf2d410d9a1fa01c2e1c99c7532a8f38d614841db56b031b0970 WatchSource:0}: Error finding container 0c6460e86317bf2d410d9a1fa01c2e1c99c7532a8f38d614841db56b031b0970: Status 404 returned error can't find the container with id 0c6460e86317bf2d410d9a1fa01c2e1c99c7532a8f38d614841db56b031b0970 Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.636789 4799 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.652943 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="800ms" Mar 19 20:05:23 crc kubenswrapper[4799]: W0319 20:05:23.676400 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-015cd7e7aaaa74abb8d3e3218789ca2e2eba7f6e8e1a7a1ec0b7b878fe4fa066 WatchSource:0}: Error finding container 015cd7e7aaaa74abb8d3e3218789ca2e2eba7f6e8e1a7a1ec0b7b878fe4fa066: Status 404 returned error can't find the container with id 015cd7e7aaaa74abb8d3e3218789ca2e2eba7f6e8e1a7a1ec0b7b878fe4fa066 Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.896406 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.898322 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.898365 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.898395 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:23 crc kubenswrapper[4799]: I0319 20:05:23.898422 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:23 crc kubenswrapper[4799]: E0319 20:05:23.898922 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.107:6443: connect: connection refused" node="crc" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.042724 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:24 crc kubenswrapper[4799]: W0319 20:05:24.061623 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:24 crc kubenswrapper[4799]: E0319 20:05:24.061705 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:24 crc kubenswrapper[4799]: W0319 20:05:24.106338 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:24 crc kubenswrapper[4799]: E0319 20:05:24.106425 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.119327 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a30a19d1ccd38c697ffc260349ce37c14d085dd53cdab71b9785ced1d957ea58"} Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.121543 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0c6460e86317bf2d410d9a1fa01c2e1c99c7532a8f38d614841db56b031b0970"} Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.122664 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5681226fb15f728fce3987a089185c965a53bd75bf12ab17f26240f63976e074"} Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.126787 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bd891e77fca38d23f2a0161fa7b06bc37d38b701420fb66657440ef513df329d"} Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.128336 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"015cd7e7aaaa74abb8d3e3218789ca2e2eba7f6e8e1a7a1ec0b7b878fe4fa066"} Mar 19 20:05:24 crc kubenswrapper[4799]: W0319 20:05:24.295339 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:24 crc kubenswrapper[4799]: E0319 20:05:24.295498 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:24 crc kubenswrapper[4799]: W0319 20:05:24.318175 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:24 crc kubenswrapper[4799]: E0319 20:05:24.318260 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:24 crc kubenswrapper[4799]: E0319 20:05:24.454269 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="1.6s" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.699284 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.701892 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.701950 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 20:05:24.701962 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:24 crc kubenswrapper[4799]: I0319 
20:05:24.702013 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:24 crc kubenswrapper[4799]: E0319 20:05:24.702548 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.043012 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.117250 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 20:05:25 crc kubenswrapper[4799]: E0319 20:05:25.118662 4799 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.133571 4799 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5b884f451a3d65fd133626ae013412d1b588733f3213ab39f05173d2b93e0753" exitCode=0 Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.133688 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.133718 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5b884f451a3d65fd133626ae013412d1b588733f3213ab39f05173d2b93e0753"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.135041 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.135081 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.135094 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.136046 4799 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="32a6fe3330a416309935e99b2426a5766d091c1ae15f299c1f5f54d0560f222f" exitCode=0 Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.136198 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.137022 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"32a6fe3330a416309935e99b2426a5766d091c1ae15f299c1f5f54d0560f222f"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.138921 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.138948 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.138960 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.142554 4799 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.142760 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0290ecba9dc3939316501eb7fa44b2fb9fb86e08d24dc93d97db8446e64b0339"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.142787 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3c0de3a23cb0b667a9a5d2e56ac388897ff389ff34d1659a2a923898a6f7d40c"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.142801 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b496680738d8c631ca436432cde204cdaa2ccc12c50d2de4ae45b76d102243aa"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.142812 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f2c37e72f55493fc298426243c3f3dbb349d51b07a336295c38878065e68c831"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.143785 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.143810 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.143820 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 
20:05:25.144946 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac" exitCode=0 Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.144990 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.145063 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.145800 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.145820 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.145830 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.147168 4799 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2276e9c088a89169f9b943fef52d433e969b943f5396cce4c571b78ecfb8bb21" exitCode=0 Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.147197 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2276e9c088a89169f9b943fef52d433e969b943f5396cce4c571b78ecfb8bb21"} Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.147296 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.148180 4799 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.148200 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.148212 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.150596 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.151600 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.151626 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.151640 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:25 crc kubenswrapper[4799]: I0319 20:05:25.795966 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.042871 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:26 crc kubenswrapper[4799]: E0319 20:05:26.055590 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="3.2s" Mar 19 20:05:26 crc 
kubenswrapper[4799]: I0319 20:05:26.152756 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.152979 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.153059 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.153131 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.157716 4799 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f7c2aa2eb744caffc6824cdb485be33287ec078acb8e94ccc32735fce17b92b1" exitCode=0 Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.157786 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f7c2aa2eb744caffc6824cdb485be33287ec078acb8e94ccc32735fce17b92b1"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.157886 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 
20:05:26.159765 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.159794 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.159805 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.161223 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4a00850a0278faea9f58b76a4ceca284b9b079782e3f41eaddca7029f58d9d4b"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.161417 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.162860 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.162876 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.162884 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.165748 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.166051 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"26878dc7e08c5e4a6fba591494203ce94b084c7568ecc9dbfb59d5be167e8828"} Mar 19 20:05:26 crc 
kubenswrapper[4799]: I0319 20:05:26.166057 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.166069 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"64bdb5f6c869404fa95fef7cd70d7be563521414f89c928a2221791f2f606e82"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.166153 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ebd02add4d3ee1c05d6ce9ee72d3fd80862cb1a0fa130eb4ecb0375e99ff8ca6"} Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.166560 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.166573 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.166581 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.168348 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.168361 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.168369 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:26 crc kubenswrapper[4799]: W0319 20:05:26.276532 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:26 crc kubenswrapper[4799]: E0319 20:05:26.276703 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.303275 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.314514 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.314566 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.314579 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:26 crc kubenswrapper[4799]: I0319 20:05:26.314618 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:26 crc kubenswrapper[4799]: E0319 20:05:26.315228 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Mar 19 20:05:26 crc kubenswrapper[4799]: W0319 20:05:26.499841 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:26 crc 
kubenswrapper[4799]: E0319 20:05:26.499922 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:26 crc kubenswrapper[4799]: W0319 20:05:26.542679 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Mar 19 20:05:26 crc kubenswrapper[4799]: E0319 20:05:26.542769 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.178654 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b1a95aefb21fa7daea097dcfd4177a8ace4584e870c51a4ea00f26fed4b7a447"} Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.178858 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.180186 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.180239 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 
20:05:27.180257 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.182569 4799 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="301c44346587e2ce6dd3d4a6f63c77e3f535357bc3c1363db076da56bb56d8ef" exitCode=0 Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.182696 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.182721 4799 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.182727 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"301c44346587e2ce6dd3d4a6f63c77e3f535357bc3c1363db076da56bb56d8ef"} Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.182776 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.182835 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.183569 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184058 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184105 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184130 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:27 
crc kubenswrapper[4799]: I0319 20:05:27.184318 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184347 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184363 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184525 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184576 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.184602 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.185767 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.185809 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.185825 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:27 crc kubenswrapper[4799]: I0319 20:05:27.631790 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.189621 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.190487 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4f48b4f6ef14e19b6113b4aa9fda19db002643a961096107cb7f6bfb3b2e7047"} Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.190537 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"860d7958f65038e73fc1da4ce75871ff87ad50d0cd539e5134783fdc0d6651a1"} Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.190561 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"04f2273bf039511770d2c470732d564b0482d9045b6be6f80c0fe8e4826d3f9d"} Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.190586 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.191125 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.191186 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.191208 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.523336 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.523672 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.525161 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.525241 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:28 crc kubenswrapper[4799]: I0319 20:05:28.525269 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.198075 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.198869 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.199101 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3bd4e9ffb5ad1289770f36968c6e68e73524c03a78e8ac50d6aea4399e20b4ff"} Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.199275 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3522ecf64a4b1cfb171130e73e8e4c0800d9747fc2da006ed1ab30a07d80510c"} Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.199609 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.199803 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.200100 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.200062 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:29 crc kubenswrapper[4799]: 
I0319 20:05:29.200474 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.200503 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.466810 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.516285 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.518177 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.518235 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.518258 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.518299 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:29 crc kubenswrapper[4799]: I0319 20:05:29.744039 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.176643 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.200554 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.200600 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:30 crc 
kubenswrapper[4799]: I0319 20:05:30.201939 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.201985 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.202002 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.202099 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.202158 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:30 crc kubenswrapper[4799]: I0319 20:05:30.202182 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.204259 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.205720 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.205772 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.205789 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.485136 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.774308 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.774570 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.776084 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.776168 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:31 crc kubenswrapper[4799]: I0319 20:05:31.776189 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.206991 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.208594 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.208650 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.208673 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.232236 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.232469 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.233757 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:32 crc 
kubenswrapper[4799]: I0319 20:05:32.233830 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.233862 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.241818 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:32 crc kubenswrapper[4799]: I0319 20:05:32.610289 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:33 crc kubenswrapper[4799]: E0319 20:05:33.203765 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:05:33 crc kubenswrapper[4799]: I0319 20:05:33.209639 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:33 crc kubenswrapper[4799]: I0319 20:05:33.211011 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:33 crc kubenswrapper[4799]: I0319 20:05:33.211063 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:33 crc kubenswrapper[4799]: I0319 20:05:33.211080 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 20:05:34.211865 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 20:05:34.213105 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 
20:05:34.213161 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 20:05:34.213175 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 20:05:34.217959 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 20:05:34.775167 4799 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:05:34 crc kubenswrapper[4799]: I0319 20:05:34.775253 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:05:35 crc kubenswrapper[4799]: I0319 20:05:35.214719 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:35 crc kubenswrapper[4799]: I0319 20:05:35.215851 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:35 crc kubenswrapper[4799]: I0319 20:05:35.215902 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:35 crc kubenswrapper[4799]: I0319 20:05:35.215921 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 
20:05:37 crc kubenswrapper[4799]: I0319 20:05:37.043015 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 19 20:05:37 crc kubenswrapper[4799]: W0319 20:05:37.206854 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 20:05:37 crc kubenswrapper[4799]: I0319 20:05:37.206992 4799 trace.go:236] Trace[22601217]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 20:05:27.205) (total time: 10001ms): Mar 19 20:05:37 crc kubenswrapper[4799]: Trace[22601217]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:05:37.206) Mar 19 20:05:37 crc kubenswrapper[4799]: Trace[22601217]: [10.001457882s] [10.001457882s] END Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.207026 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.355276 4799 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.358342 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.361858 4799 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189e56c8a815d8bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,LastTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:05:37 crc kubenswrapper[4799]: I0319 20:05:37.366821 4799 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 20:05:37 crc kubenswrapper[4799]: I0319 20:05:37.366889 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 20:05:37 crc kubenswrapper[4799]: W0319 20:05:37.372123 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.372227 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:37 crc kubenswrapper[4799]: W0319 20:05:37.372628 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.372674 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:37 crc kubenswrapper[4799]: I0319 20:05:37.374603 4799 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 20:05:37 crc kubenswrapper[4799]: I0319 20:05:37.374666 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 20:05:37 crc kubenswrapper[4799]: W0319 20:05:37.374827 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.374869 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:37 crc kubenswrapper[4799]: E0319 20:05:37.376570 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:37Z is after 
2026-02-23T05:33:13Z" interval="6.4s" Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.047569 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:38Z is after 2026-02-23T05:33:13Z Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.225326 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.228546 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b1a95aefb21fa7daea097dcfd4177a8ace4584e870c51a4ea00f26fed4b7a447" exitCode=255 Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.228605 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b1a95aefb21fa7daea097dcfd4177a8ace4584e870c51a4ea00f26fed4b7a447"} Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.228812 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.230227 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.230293 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.230313 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:38 crc kubenswrapper[4799]: I0319 20:05:38.231153 4799 
scope.go:117] "RemoveContainer" containerID="b1a95aefb21fa7daea097dcfd4177a8ace4584e870c51a4ea00f26fed4b7a447" Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.048070 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:39Z is after 2026-02-23T05:33:13Z Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.234052 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.235914 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83"} Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.236177 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.237763 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.237826 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.237849 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:39 crc kubenswrapper[4799]: I0319 20:05:39.752771 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.047850 
4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:40Z is after 2026-02-23T05:33:13Z Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.217523 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.217762 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.219265 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.219342 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.219362 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.237415 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.240750 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.241552 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.244261 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" exitCode=255 Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.244376 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83"} Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.244429 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.244461 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.244473 4799 scope.go:117] "RemoveContainer" containerID="b1a95aefb21fa7daea097dcfd4177a8ace4584e870c51a4ea00f26fed4b7a447" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.245897 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.245957 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.245975 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.246276 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.246331 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.246357 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 
20:05:40.246840 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:05:40 crc kubenswrapper[4799]: E0319 20:05:40.247178 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:05:40 crc kubenswrapper[4799]: I0319 20:05:40.251260 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:40 crc kubenswrapper[4799]: W0319 20:05:40.846554 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:40Z is after 2026-02-23T05:33:13Z Mar 19 20:05:40 crc kubenswrapper[4799]: E0319 20:05:40.846657 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.047794 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-19T20:05:41Z is after 2026-02-23T05:33:13Z Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.249608 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.252523 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.253768 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.253842 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.253866 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:41 crc kubenswrapper[4799]: I0319 20:05:41.254727 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:05:41 crc kubenswrapper[4799]: E0319 20:05:41.255007 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:05:42 crc kubenswrapper[4799]: I0319 20:05:42.046732 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T20:05:42Z is after 2026-02-23T05:33:13Z Mar 19 20:05:42 crc kubenswrapper[4799]: I0319 20:05:42.254563 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:42 crc kubenswrapper[4799]: I0319 20:05:42.255722 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:42 crc kubenswrapper[4799]: I0319 20:05:42.255821 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:42 crc kubenswrapper[4799]: I0319 20:05:42.255856 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:42 crc kubenswrapper[4799]: I0319 20:05:42.256536 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:05:42 crc kubenswrapper[4799]: E0319 20:05:42.256727 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.047420 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:43Z is after 2026-02-23T05:33:13Z Mar 19 20:05:43 crc kubenswrapper[4799]: E0319 20:05:43.203900 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:05:43 
crc kubenswrapper[4799]: I0319 20:05:43.646500 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.646755 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.648074 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.648140 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.648162 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.649059 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:05:43 crc kubenswrapper[4799]: E0319 20:05:43.649351 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.758901 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.760369 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.760477 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.760491 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:43 crc kubenswrapper[4799]: I0319 20:05:43.760519 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:43 crc kubenswrapper[4799]: E0319 20:05:43.764836 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:43Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 20:05:43 crc kubenswrapper[4799]: E0319 20:05:43.780967 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:43Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.047689 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:44Z is after 2026-02-23T05:33:13Z Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.125756 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.259647 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.261031 4799 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.261092 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.261118 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.262169 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:05:44 crc kubenswrapper[4799]: E0319 20:05:44.262497 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.776093 4799 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:05:44 crc kubenswrapper[4799]: I0319 20:05:44.776198 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:05:45 crc kubenswrapper[4799]: I0319 20:05:45.047251 4799 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:45Z is after 2026-02-23T05:33:13Z Mar 19 20:05:45 crc kubenswrapper[4799]: W0319 20:05:45.339494 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:45Z is after 2026-02-23T05:33:13Z Mar 19 20:05:45 crc kubenswrapper[4799]: E0319 20:05:45.339711 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:45 crc kubenswrapper[4799]: I0319 20:05:45.992505 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 20:05:45 crc kubenswrapper[4799]: E0319 20:05:45.998603 4799 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:46 crc kubenswrapper[4799]: I0319 20:05:46.047143 4799 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:46Z is after 2026-02-23T05:33:13Z Mar 19 20:05:47 crc kubenswrapper[4799]: I0319 20:05:47.047109 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:47Z is after 2026-02-23T05:33:13Z Mar 19 20:05:47 crc kubenswrapper[4799]: E0319 20:05:47.367979 4799 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e56c8a815d8bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,LastTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:05:47 crc kubenswrapper[4799]: W0319 20:05:47.884007 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T20:05:47Z is after 2026-02-23T05:33:13Z Mar 19 20:05:47 crc kubenswrapper[4799]: E0319 20:05:47.884092 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:48 crc kubenswrapper[4799]: I0319 20:05:48.048032 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:48Z is after 2026-02-23T05:33:13Z Mar 19 20:05:49 crc kubenswrapper[4799]: I0319 20:05:49.047827 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:49Z is after 2026-02-23T05:33:13Z Mar 19 20:05:49 crc kubenswrapper[4799]: W0319 20:05:49.138877 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:49Z is after 2026-02-23T05:33:13Z Mar 19 20:05:49 crc kubenswrapper[4799]: E0319 20:05:49.139009 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed 
to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:49 crc kubenswrapper[4799]: W0319 20:05:49.489944 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:49Z is after 2026-02-23T05:33:13Z Mar 19 20:05:49 crc kubenswrapper[4799]: E0319 20:05:49.490049 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:05:50 crc kubenswrapper[4799]: I0319 20:05:50.048344 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:50Z is after 2026-02-23T05:33:13Z Mar 19 20:05:50 crc kubenswrapper[4799]: I0319 20:05:50.765701 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:50 crc kubenswrapper[4799]: I0319 20:05:50.767835 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:50 crc 
kubenswrapper[4799]: I0319 20:05:50.767911 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:50 crc kubenswrapper[4799]: I0319 20:05:50.767930 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:50 crc kubenswrapper[4799]: I0319 20:05:50.768046 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:50 crc kubenswrapper[4799]: E0319 20:05:50.774667 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 20:05:50 crc kubenswrapper[4799]: E0319 20:05:50.786958 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:50Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 20:05:51 crc kubenswrapper[4799]: I0319 20:05:51.047969 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:51Z is after 2026-02-23T05:33:13Z Mar 19 20:05:52 crc kubenswrapper[4799]: I0319 20:05:52.049244 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T20:05:52Z is after 2026-02-23T05:33:13Z Mar 19 20:05:53 crc kubenswrapper[4799]: I0319 20:05:53.046960 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:53Z is after 2026-02-23T05:33:13Z Mar 19 20:05:53 crc kubenswrapper[4799]: E0319 20:05:53.204049 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.047875 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:54Z is after 2026-02-23T05:33:13Z Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.775314 4799 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.775466 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.775549 4799 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.775751 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.777369 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.777481 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.777506 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.778326 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b496680738d8c631ca436432cde204cdaa2ccc12c50d2de4ae45b76d102243aa"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 20:05:54 crc kubenswrapper[4799]: I0319 20:05:54.778615 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://b496680738d8c631ca436432cde204cdaa2ccc12c50d2de4ae45b76d102243aa" gracePeriod=30 Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.048265 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T20:05:55Z is after 2026-02-23T05:33:13Z Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.295737 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.296378 4799 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b496680738d8c631ca436432cde204cdaa2ccc12c50d2de4ae45b76d102243aa" exitCode=255 Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.296538 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b496680738d8c631ca436432cde204cdaa2ccc12c50d2de4ae45b76d102243aa"} Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.296604 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"637c951a577022ed4f39f3a880fbcf16a21819c2c0a4dbce9ee2b47bab9bcbd6"} Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.296817 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.298346 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.298460 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.298485 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:55 crc kubenswrapper[4799]: I0319 20:05:55.796826 4799 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:05:56 crc kubenswrapper[4799]: I0319 20:05:56.047559 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:56Z is after 2026-02-23T05:33:13Z Mar 19 20:05:56 crc kubenswrapper[4799]: I0319 20:05:56.299513 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:56 crc kubenswrapper[4799]: I0319 20:05:56.301094 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:56 crc kubenswrapper[4799]: I0319 20:05:56.301153 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:56 crc kubenswrapper[4799]: I0319 20:05:56.301177 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:57 crc kubenswrapper[4799]: I0319 20:05:57.046856 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:57Z is after 2026-02-23T05:33:13Z Mar 19 20:05:57 crc kubenswrapper[4799]: E0319 20:05:57.373268 4799 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:57Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189e56c8a815d8bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,LastTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:05:57 crc kubenswrapper[4799]: I0319 20:05:57.775187 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:57 crc kubenswrapper[4799]: I0319 20:05:57.777182 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:57 crc kubenswrapper[4799]: I0319 20:05:57.777241 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:57 crc kubenswrapper[4799]: I0319 20:05:57.777263 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:57 crc kubenswrapper[4799]: I0319 20:05:57.777302 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:05:57 crc kubenswrapper[4799]: E0319 20:05:57.782170 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 20:05:57 crc kubenswrapper[4799]: E0319 20:05:57.792112 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 20:05:58 crc kubenswrapper[4799]: I0319 20:05:58.047891 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:58Z is after 2026-02-23T05:33:13Z Mar 19 20:05:59 crc kubenswrapper[4799]: I0319 20:05:59.047938 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:05:59Z is after 2026-02-23T05:33:13Z Mar 19 20:05:59 crc kubenswrapper[4799]: I0319 20:05:59.115933 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:05:59 crc kubenswrapper[4799]: I0319 20:05:59.117307 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:05:59 crc kubenswrapper[4799]: I0319 20:05:59.117373 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:05:59 crc kubenswrapper[4799]: I0319 20:05:59.117426 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:05:59 crc kubenswrapper[4799]: I0319 20:05:59.119208 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.047279 
4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:00Z is after 2026-02-23T05:33:13Z Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.317244 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.318060 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.320848 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254" exitCode=255 Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.320896 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254"} Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.320957 4799 scope.go:117] "RemoveContainer" containerID="1d326d2620282dc9156192bfbf160530c4e889b4383873d81064d1798c36fb83" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.321133 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.322520 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.322568 4799 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.322585 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:00 crc kubenswrapper[4799]: I0319 20:06:00.323332 4799 scope.go:117] "RemoveContainer" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254" Mar 19 20:06:00 crc kubenswrapper[4799]: E0319 20:06:00.323806 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.047808 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:01Z is after 2026-02-23T05:33:13Z Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.326500 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.774751 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.775005 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.776766 4799 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.776816 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:01 crc kubenswrapper[4799]: I0319 20:06:01.776838 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:02 crc kubenswrapper[4799]: I0319 20:06:02.046612 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:02Z is after 2026-02-23T05:33:13Z Mar 19 20:06:02 crc kubenswrapper[4799]: W0319 20:06:02.319881 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:02Z is after 2026-02-23T05:33:13Z Mar 19 20:06:02 crc kubenswrapper[4799]: E0319 20:06:02.319986 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:06:02 crc kubenswrapper[4799]: I0319 20:06:02.390555 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 20:06:02 crc kubenswrapper[4799]: E0319 20:06:02.393762 4799 certificate_manager.go:562] 
"Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:06:02 crc kubenswrapper[4799]: E0319 20:06:02.394927 4799 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 20:06:02 crc kubenswrapper[4799]: W0319 20:06:02.805134 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:02Z is after 2026-02-23T05:33:13Z Mar 19 20:06:02 crc kubenswrapper[4799]: E0319 20:06:02.805213 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:06:03 crc kubenswrapper[4799]: W0319 20:06:03.022833 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-19T20:06:03Z is after 2026-02-23T05:33:13Z Mar 19 20:06:03 crc kubenswrapper[4799]: E0319 20:06:03.022942 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.045722 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:03Z is after 2026-02-23T05:33:13Z Mar 19 20:06:03 crc kubenswrapper[4799]: E0319 20:06:03.204171 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.646810 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.647097 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.648462 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.648494 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.648527 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 20:06:03 crc kubenswrapper[4799]: I0319 20:06:03.649010 4799 scope.go:117] "RemoveContainer" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254" Mar 19 20:06:03 crc kubenswrapper[4799]: E0319 20:06:03.649212 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.047879 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:04Z is after 2026-02-23T05:33:13Z Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.126444 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.338074 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.339449 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.339511 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.339531 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 
20:06:04.340455 4799 scope.go:117] "RemoveContainer" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254" Mar 19 20:06:04 crc kubenswrapper[4799]: E0319 20:06:04.340751 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.775458 4799 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.775580 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.782699 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.784136 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.784195 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:04 crc 
kubenswrapper[4799]: I0319 20:06:04.784214 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:04 crc kubenswrapper[4799]: I0319 20:06:04.784252 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:06:04 crc kubenswrapper[4799]: E0319 20:06:04.791091 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:04Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 20:06:04 crc kubenswrapper[4799]: E0319 20:06:04.797995 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:04Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 20:06:05 crc kubenswrapper[4799]: I0319 20:06:05.045605 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:05Z is after 2026-02-23T05:33:13Z Mar 19 20:06:06 crc kubenswrapper[4799]: I0319 20:06:06.046810 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:06Z is after 2026-02-23T05:33:13Z Mar 19 20:06:07 crc kubenswrapper[4799]: I0319 20:06:07.047175 4799 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:07Z is after 2026-02-23T05:33:13Z Mar 19 20:06:07 crc kubenswrapper[4799]: E0319 20:06:07.379080 4799 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:07Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e56c8a815d8bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,LastTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:08 crc kubenswrapper[4799]: I0319 20:06:08.047137 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:08Z is after 2026-02-23T05:33:13Z Mar 19 20:06:09 crc kubenswrapper[4799]: I0319 20:06:09.047235 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:09Z is after 
2026-02-23T05:33:13Z Mar 19 20:06:10 crc kubenswrapper[4799]: I0319 20:06:10.047607 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:10Z is after 2026-02-23T05:33:13Z Mar 19 20:06:11 crc kubenswrapper[4799]: I0319 20:06:11.047464 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:11Z is after 2026-02-23T05:33:13Z Mar 19 20:06:11 crc kubenswrapper[4799]: W0319 20:06:11.188437 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:11Z is after 2026-02-23T05:33:13Z Mar 19 20:06:11 crc kubenswrapper[4799]: E0319 20:06:11.188532 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 20:06:11 crc kubenswrapper[4799]: I0319 20:06:11.792074 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:11 crc kubenswrapper[4799]: I0319 20:06:11.793556 4799 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:11 crc kubenswrapper[4799]: I0319 20:06:11.793617 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:11 crc kubenswrapper[4799]: I0319 20:06:11.793635 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:11 crc kubenswrapper[4799]: I0319 20:06:11.793668 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:06:11 crc kubenswrapper[4799]: E0319 20:06:11.798615 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:11Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 20:06:11 crc kubenswrapper[4799]: E0319 20:06:11.803186 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:11Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 20:06:12 crc kubenswrapper[4799]: I0319 20:06:12.047964 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:12Z is after 2026-02-23T05:33:13Z Mar 19 20:06:13 crc kubenswrapper[4799]: I0319 20:06:13.048518 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:06:13Z is after 2026-02-23T05:33:13Z Mar 19 20:06:13 crc kubenswrapper[4799]: E0319 20:06:13.204316 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:06:14 crc kubenswrapper[4799]: I0319 20:06:14.050964 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:14 crc kubenswrapper[4799]: I0319 20:06:14.775371 4799 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:06:14 crc kubenswrapper[4799]: I0319 20:06:14.775552 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:06:15 crc kubenswrapper[4799]: I0319 20:06:15.048537 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:15 crc kubenswrapper[4799]: I0319 20:06:15.115416 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:15 crc 
kubenswrapper[4799]: I0319 20:06:15.116907 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:06:15 crc kubenswrapper[4799]: I0319 20:06:15.116966 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:06:15 crc kubenswrapper[4799]: I0319 20:06:15.116992 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:06:15 crc kubenswrapper[4799]: I0319 20:06:15.117887 4799 scope.go:117] "RemoveContainer" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254"
Mar 19 20:06:15 crc kubenswrapper[4799]: E0319 20:06:15.118163 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 20:06:16 crc kubenswrapper[4799]: I0319 20:06:16.050988 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 20:06:17 crc kubenswrapper[4799]: I0319 20:06:17.049471 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.387719 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8a815d8bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,LastTimestamp:2026-03-19 20:05:23.044006079 +0000 UTC m=+0.649959191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.393472 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.398515 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.400851 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.405171 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8b1293024 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.19626858 +0000 UTC m=+0.802221662,LastTimestamp:2026-03-19 20:05:23.19626858 +0000 UTC m=+0.802221662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.407469 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.217498624 +0000 UTC m=+0.823451686,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.414141 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.217511015 +0000 UTC m=+0.823464087,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.420721 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32c4ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.217520535 +0000 UTC m=+0.823473607,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.427808 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.218331602 +0000 UTC m=+0.824284684,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.434523 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.218355583 +0000 UTC m=+0.824308665,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.441986 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32c4ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.218366293 +0000 UTC m=+0.824319375,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.449105 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.219061296 +0000 UTC m=+0.825014378,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.456868 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.219084517 +0000 UTC m=+0.825037599,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.464120 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32c4ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.219096207 +0000 UTC m=+0.825049289,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.470852 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.219503411 +0000 UTC m=+0.825456483,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.477793 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.219524162 +0000 UTC m=+0.825477234,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.484524 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32c4ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.219534292 +0000 UTC m=+0.825487364,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.490592 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.219589404 +0000 UTC m=+0.825542476,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.497501 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.219603704 +0000 UTC m=+0.825556776,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.501274 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32c4ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.219614545 +0000 UTC m=+0.825567617,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.506854 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.220328368 +0000 UTC m=+0.826281440,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.513490 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.220351849 +0000 UTC m=+0.826304921,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.520195 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32c4ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32c4ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113010348 +0000 UTC m=+0.718963420,LastTimestamp:2026-03-19 20:05:23.220361769 +0000 UTC m=+0.826314841,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.526681 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac325b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac325b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.112983357 +0000 UTC m=+0.718936429,LastTimestamp:2026-03-19 20:05:23.220572436 +0000 UTC m=+0.826525518,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.533171 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e56c8ac32a0ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e56c8ac32a0ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.113001198 +0000 UTC m=+0.718954270,LastTimestamp:2026-03-19 20:05:23.220586077 +0000 UTC m=+0.826539159,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.541782 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c8cb3d63e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.633800165 +0000 UTC m=+1.239753267,LastTimestamp:2026-03-19 20:05:23.633800165 +0000 UTC m=+1.239753267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.548509 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e56c8cb5351ca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.635237322 +0000 UTC m=+1.241190434,LastTimestamp:2026-03-19 20:05:23.635237322 +0000 UTC m=+1.241190434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.555570 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c8cb5e88dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.635972317 +0000 UTC m=+1.241925429,LastTimestamp:2026-03-19 20:05:23.635972317 +0000 UTC m=+1.241925429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.562737 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8cc3442dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.649979101 +0000 UTC m=+1.255932183,LastTimestamp:2026-03-19 20:05:23.649979101 +0000 UTC m=+1.255932183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.569924 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c8cdeb71d2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:23.678761426 +0000 UTC m=+1.284714538,LastTimestamp:2026-03-19 20:05:23.678761426 +0000 UTC m=+1.284714538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.577234 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c8edfca3fe openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.216759294 +0000 UTC m=+1.822712396,LastTimestamp:2026-03-19 20:05:24.216759294 +0000 UTC m=+1.822712396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.583811 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c8edfd15c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.216788425 +0000 UTC m=+1.822741487,LastTimestamp:2026-03-19 20:05:24.216788425 +0000 UTC m=+1.822741487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.590534 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c8edff1f17 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.216921879 +0000 UTC m=+1.822874991,LastTimestamp:2026-03-19 20:05:24.216921879 +0000 UTC m=+1.822874991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.596970 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e56c8ee06b633 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.217419315 +0000 UTC m=+1.823372427,LastTimestamp:2026-03-19 20:05:24.217419315 +0000 UTC m=+1.823372427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.603661 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8ee16447b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.218438779 +0000 UTC m=+1.824391851,LastTimestamp:2026-03-19 20:05:24.218438779 +0000 UTC m=+1.824391851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.610234 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8eeb99fa4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.229144484 +0000 UTC m=+1.835097556,LastTimestamp:2026-03-19 20:05:24.229144484 +0000 UTC m=+1.835097556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.616687 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8eed4776f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.230903663 +0000 UTC m=+1.836856735,LastTimestamp:2026-03-19 20:05:24.230903663 +0000 UTC m=+1.836856735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.623305 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c8ef0c4484 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.234560644 +0000 UTC m=+1.840513726,LastTimestamp:2026-03-19 20:05:24.234560644 +0000 UTC m=+1.840513726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.630785 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e56c8ef43f579 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.238210425 +0000 UTC m=+1.844163497,LastTimestamp:2026-03-19 20:05:24.238210425 +0000 UTC m=+1.844163497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.637457 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c8ef493bb7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.238556087 +0000 UTC m=+1.844509159,LastTimestamp:2026-03-19 20:05:24.238556087 +0000 UTC m=+1.844509159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.644626 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c8ef4ee587 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.238927239 +0000 UTC
m=+1.844880311,LastTimestamp:2026-03-19 20:05:24.238927239 +0000 UTC m=+1.844880311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.650984 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8ffe7aa52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.517374546 +0000 UTC m=+2.123327628,LastTimestamp:2026-03-19 20:05:24.517374546 +0000 UTC m=+2.123327628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.658083 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c9008d17d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.528216016 +0000 UTC m=+2.134169098,LastTimestamp:2026-03-19 20:05:24.528216016 +0000 UTC m=+2.134169098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.664698 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c9009b5dc3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.529151427 +0000 UTC m=+2.135104509,LastTimestamp:2026-03-19 20:05:24.529151427 +0000 UTC m=+2.135104509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.672019 4799 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c9105d01ba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.79350009 +0000 UTC m=+2.399453202,LastTimestamp:2026-03-19 20:05:24.79350009 +0000 UTC m=+2.399453202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.678524 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c91157be43 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.809932355 +0000 UTC m=+2.415885507,LastTimestamp:2026-03-19 20:05:24.809932355 +0000 UTC m=+2.415885507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.685104 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c9116da10b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.811366667 +0000 UTC m=+2.417319779,LastTimestamp:2026-03-19 20:05:24.811366667 +0000 UTC m=+2.417319779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.691830 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c9202617e6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.058336742 +0000 UTC m=+2.664289864,LastTimestamp:2026-03-19 20:05:25.058336742 +0000 UTC m=+2.664289864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.698273 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c921094f33 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.073227571 +0000 UTC m=+2.679180653,LastTimestamp:2026-03-19 20:05:25.073227571 +0000 UTC m=+2.679180653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.708726 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e56c924d678d0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.137004752 +0000 UTC m=+2.742957844,LastTimestamp:2026-03-19 20:05:25.137004752 +0000 UTC m=+2.742957844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.717579 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c925156eb6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.141130934 +0000 UTC m=+2.747084016,LastTimestamp:2026-03-19 20:05:25.141130934 +0000 UTC 
m=+2.747084016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.725628 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c925a3df89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.150465929 +0000 UTC m=+2.756419011,LastTimestamp:2026-03-19 20:05:25.150465929 +0000 UTC m=+2.756419011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.733534 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c925b07de7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.151292903 +0000 UTC m=+2.757246015,LastTimestamp:2026-03-19 20:05:25.151292903 +0000 UTC m=+2.757246015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.740576 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c932b5adbd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.369736637 +0000 UTC m=+2.975689709,LastTimestamp:2026-03-19 20:05:25.369736637 +0000 UTC m=+2.975689709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.747236 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e56c932bdda10 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.370272272 +0000 UTC m=+2.976225344,LastTimestamp:2026-03-19 20:05:25.370272272 +0000 UTC m=+2.976225344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.754371 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c932c42503 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.370684675 +0000 UTC m=+2.976637747,LastTimestamp:2026-03-19 20:05:25.370684675 +0000 UTC m=+2.976637747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.761000 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c9330f24d6 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.37559983 +0000 UTC m=+2.981552892,LastTimestamp:2026-03-19 20:05:25.37559983 +0000 UTC m=+2.981552892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.767505 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c933a8c697 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.385668247 +0000 UTC m=+2.991621319,LastTimestamp:2026-03-19 20:05:25.385668247 +0000 UTC m=+2.991621319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.774575 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c933bbd8eb 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.386918123 +0000 UTC m=+2.992871195,LastTimestamp:2026-03-19 20:05:25.386918123 +0000 UTC m=+2.992871195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.781231 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c933c4148f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.387457679 +0000 UTC m=+2.993410751,LastTimestamp:2026-03-19 20:05:25.387457679 +0000 UTC m=+2.993410751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.787648 4799 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e56c933ca753a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.387875642 +0000 UTC m=+2.993828724,LastTimestamp:2026-03-19 20:05:25.387875642 +0000 UTC m=+2.993828724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.794085 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c933e060e0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.389312224 +0000 UTC m=+2.995265296,LastTimestamp:2026-03-19 20:05:25.389312224 +0000 UTC 
m=+2.995265296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.800540 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c93f30e7d9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.579139033 +0000 UTC m=+3.185092105,LastTimestamp:2026-03-19 20:05:25.579139033 +0000 UTC m=+3.185092105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.807512 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c93f3e2bcf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
20:05:25.580008399 +0000 UTC m=+3.185961481,LastTimestamp:2026-03-19 20:05:25.580008399 +0000 UTC m=+3.185961481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.818241 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c93fd7dbf0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.590080496 +0000 UTC m=+3.196033598,LastTimestamp:2026-03-19 20:05:25.590080496 +0000 UTC m=+3.196033598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.822709 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c93fdb437d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.590303613 +0000 UTC m=+3.196256685,LastTimestamp:2026-03-19 20:05:25.590303613 +0000 UTC m=+3.196256685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.824813 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c94030024e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.595857486 +0000 UTC m=+3.201810558,LastTimestamp:2026-03-19 20:05:25.595857486 +0000 UTC m=+3.201810558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.829704 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c9403065e1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.595882977 +0000 UTC m=+3.201836049,LastTimestamp:2026-03-19 20:05:25.595882977 +0000 UTC m=+3.201836049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.832884 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c94c69833c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.800952636 +0000 UTC m=+3.406905708,LastTimestamp:2026-03-19 20:05:25.800952636 +0000 UTC m=+3.406905708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.836466 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c94caabe34 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.805227572 +0000 UTC m=+3.411180644,LastTimestamp:2026-03-19 20:05:25.805227572 +0000 UTC m=+3.411180644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.839780 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c94d594bb3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.816667059 +0000 UTC m=+3.422620131,LastTimestamp:2026-03-19 20:05:25.816667059 +0000 UTC m=+3.422620131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 
20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.843228 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c94d7fbf00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.819186944 +0000 UTC m=+3.425140026,LastTimestamp:2026-03-19 20:05:25.819186944 +0000 UTC m=+3.425140026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.846450 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e56c94da78753 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
20:05:25.821794131 +0000 UTC m=+3.427747233,LastTimestamp:2026-03-19 20:05:25.821794131 +0000 UTC m=+3.427747233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.849572 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9529596d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:25.90450453 +0000 UTC m=+3.510457612,LastTimestamp:2026-03-19 20:05:25.90450453 +0000 UTC m=+3.510457612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.853594 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c958a76392 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.006334354 +0000 UTC m=+3.612287426,LastTimestamp:2026-03-19 20:05:26.006334354 +0000 UTC m=+3.612287426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.858979 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c9598bd613 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.021305875 +0000 UTC m=+3.627258957,LastTimestamp:2026-03-19 20:05:26.021305875 +0000 UTC m=+3.627258957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.865842 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c9599aa642 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.022276674 +0000 UTC m=+3.628229756,LastTimestamp:2026-03-19 20:05:26.022276674 +0000 UTC m=+3.628229756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.872824 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c961ef5b8d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.162045837 +0000 UTC m=+3.767998929,LastTimestamp:2026-03-19 20:05:26.162045837 +0000 UTC m=+3.767998929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.881034 4799 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c9640f0bed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.197677037 +0000 UTC m=+3.803630109,LastTimestamp:2026-03-19 20:05:26.197677037 +0000 UTC m=+3.803630109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.887618 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c964ba1d7b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.208888187 +0000 UTC m=+3.814841289,LastTimestamp:2026-03-19 20:05:26.208888187 +0000 UTC m=+3.814841289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc 
kubenswrapper[4799]: E0319 20:06:17.894890 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c96f10cbe6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.382341094 +0000 UTC m=+3.988294166,LastTimestamp:2026-03-19 20:05:26.382341094 +0000 UTC m=+3.988294166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.901783 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c96ffc349b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.397768859 +0000 UTC m=+4.003721971,LastTimestamp:2026-03-19 20:05:26.397768859 +0000 UTC m=+4.003721971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.910926 
4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c99f1122ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.187669678 +0000 UTC m=+4.793622780,LastTimestamp:2026-03-19 20:05:27.187669678 +0000 UTC m=+4.793622780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.918264 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9acc5e167 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.417618791 +0000 UTC m=+5.023571893,LastTimestamp:2026-03-19 20:05:27.417618791 +0000 UTC m=+5.023571893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc 
kubenswrapper[4799]: E0319 20:06:17.926003 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9ad77e2c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.429284545 +0000 UTC m=+5.035237657,LastTimestamp:2026-03-19 20:05:27.429284545 +0000 UTC m=+5.035237657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.932580 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9ad95aeca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.431237322 +0000 UTC m=+5.037190424,LastTimestamp:2026-03-19 20:05:27.431237322 +0000 UTC m=+5.037190424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.939301 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9be01ebd1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.706766289 +0000 UTC m=+5.312719391,LastTimestamp:2026-03-19 20:05:27.706766289 +0000 UTC m=+5.312719391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.945821 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9bf02186c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.723554924 +0000 UTC m=+5.329508036,LastTimestamp:2026-03-19 20:05:27.723554924 +0000 UTC m=+5.329508036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc 
kubenswrapper[4799]: E0319 20:06:17.952703 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9bf183339 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.725003577 +0000 UTC m=+5.330956689,LastTimestamp:2026-03-19 20:05:27.725003577 +0000 UTC m=+5.330956689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.959464 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9cdb75b05 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.970315013 +0000 UTC m=+5.576268115,LastTimestamp:2026-03-19 20:05:27.970315013 +0000 UTC m=+5.576268115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.966162 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9ceba39f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.987280373 +0000 UTC m=+5.593233495,LastTimestamp:2026-03-19 20:05:27.987280373 +0000 UTC m=+5.593233495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.972666 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9ced03d98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:27.988723096 +0000 UTC m=+5.594676208,LastTimestamp:2026-03-19 20:05:27.988723096 +0000 UTC 
m=+5.594676208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.980444 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9deaaae97 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:28.254697111 +0000 UTC m=+5.860650223,LastTimestamp:2026-03-19 20:05:28.254697111 +0000 UTC m=+5.860650223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.987332 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9dfa2b3f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:28.270951411 +0000 UTC m=+5.876904513,LastTimestamp:2026-03-19 20:05:28.270951411 +0000 UTC m=+5.876904513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:17 crc kubenswrapper[4799]: E0319 20:06:17.995165 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9dfbb0c3e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:28.272546878 +0000 UTC m=+5.878499980,LastTimestamp:2026-03-19 20:05:28.272546878 +0000 UTC m=+5.878499980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.001965 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9efd1e220 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:28.54247888 +0000 UTC m=+6.148431992,LastTimestamp:2026-03-19 20:05:28.54247888 +0000 UTC 
m=+6.148431992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.008625 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e56c9f0c878ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:28.558639306 +0000 UTC m=+6.164592418,LastTimestamp:2026-03-19 20:05:28.558639306 +0000 UTC m=+6.164592418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.018853 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: &Event{ObjectMeta:{kube-controller-manager-crc.189e56cb63521e44 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers) Mar 19 20:06:18 crc kubenswrapper[4799]: body: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:34.77523002 +0000 UTC m=+12.381183102,LastTimestamp:2026-03-19 20:05:34.77523002 +0000 UTC m=+12.381183102,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc kubenswrapper[4799]: > Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.028625 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56cb63530d10 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:34.775291152 +0000 UTC m=+12.381244234,LastTimestamp:2026-03-19 20:05:34.775291152 +0000 UTC m=+12.381244234,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.036783 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: 
&Event{ObjectMeta:{kube-apiserver-crc.189e56cbfdcb6b81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 20:06:18 crc kubenswrapper[4799]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 20:06:18 crc kubenswrapper[4799]: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:37.366870913 +0000 UTC m=+14.972823985,LastTimestamp:2026-03-19 20:05:37.366870913 +0000 UTC m=+14.972823985,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc kubenswrapper[4799]: > Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.043657 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56cbfdcc1280 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:37.366913664 +0000 UTC m=+14.972866736,LastTimestamp:2026-03-19 
20:05:37.366913664 +0000 UTC m=+14.972866736,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.043938 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.050369 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e56cbfdcb6b81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: &Event{ObjectMeta:{kube-apiserver-crc.189e56cbfdcb6b81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 20:06:18 crc kubenswrapper[4799]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 20:06:18 crc kubenswrapper[4799]: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:37.366870913 +0000 UTC m=+14.972823985,LastTimestamp:2026-03-19 20:05:37.374637382 +0000 UTC m=+14.980590444,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc kubenswrapper[4799]: > Mar 19 20:06:18 
crc kubenswrapper[4799]: E0319 20:06:18.057926 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e56cbfdcc1280\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56cbfdcc1280 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:37.366913664 +0000 UTC m=+14.972866736,LastTimestamp:2026-03-19 20:05:37.374691993 +0000 UTC m=+14.980645065,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.067251 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e56c9599aa642\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c9599aa642 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.022276674 +0000 UTC m=+3.628229756,LastTimestamp:2026-03-19 20:05:38.232599458 +0000 UTC m=+15.838552560,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.075933 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e56c9640f0bed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c9640f0bed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.197677037 +0000 UTC m=+3.803630109,LastTimestamp:2026-03-19 20:05:38.500364387 +0000 UTC m=+16.106317459,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.081369 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e56c964ba1d7b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e56c964ba1d7b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:26.208888187 +0000 UTC m=+3.814841289,LastTimestamp:2026-03-19 20:05:38.507740744 +0000 UTC m=+16.113693816,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.090083 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: &Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76c4849 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 20:06:18 crc kubenswrapper[4799]: body: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776165449 +0000 UTC m=+22.382118561,LastTimestamp:2026-03-19 20:05:44.776165449 +0000 UTC m=+22.382118561,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc 
kubenswrapper[4799]: > Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.098246 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76d4881 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776231041 +0000 UTC m=+22.382184143,LastTimestamp:2026-03-19 20:05:44.776231041 +0000 UTC m=+22.382184143,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.105028 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56cdb76c4849\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: &Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76c4849 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 20:06:18 crc kubenswrapper[4799]: body: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776165449 +0000 UTC m=+22.382118561,LastTimestamp:2026-03-19 20:05:54.775436067 +0000 UTC m=+32.381389179,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc kubenswrapper[4799]: > Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.110965 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56cdb76d4881\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76d4881 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776231041 +0000 UTC m=+22.382184143,LastTimestamp:2026-03-19 20:05:54.77550768 
+0000 UTC m=+32.381460782,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.117709 4799 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56d00b9d2748 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:54.778589 +0000 UTC m=+32.384542112,LastTimestamp:2026-03-19 20:05:54.778589 +0000 UTC m=+32.384542112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.124250 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56c8eed4776f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8eed4776f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.230903663 +0000 UTC m=+1.836856735,LastTimestamp:2026-03-19 20:05:54.903128103 +0000 UTC m=+32.509081205,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.131962 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56c8ffe7aa52\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c8ffe7aa52 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.517374546 +0000 UTC m=+2.123327628,LastTimestamp:2026-03-19 20:05:55.152366265 +0000 UTC m=+32.758319377,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.138228 4799 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e56c9008d17d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56c9008d17d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:24.528216016 +0000 UTC m=+2.134169098,LastTimestamp:2026-03-19 20:05:55.163879934 +0000 UTC m=+32.769833036,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.148511 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56cdb76c4849\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: &Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76c4849 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
19 20:06:18 crc kubenswrapper[4799]: body: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776165449 +0000 UTC m=+22.382118561,LastTimestamp:2026-03-19 20:06:04.7755422 +0000 UTC m=+42.381495312,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc kubenswrapper[4799]: > Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.155822 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56cdb76d4881\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76d4881 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776231041 +0000 UTC m=+22.382184143,LastTimestamp:2026-03-19 20:06:04.775626242 +0000 UTC m=+42.381579354,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.161638 4799 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e56cdb76c4849\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 20:06:18 crc kubenswrapper[4799]: &Event{ObjectMeta:{kube-controller-manager-crc.189e56cdb76c4849 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 20:06:18 crc kubenswrapper[4799]: body: Mar 19 20:06:18 crc kubenswrapper[4799]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:05:44.776165449 +0000 UTC m=+22.382118561,LastTimestamp:2026-03-19 20:06:14.775512297 +0000 UTC m=+52.381465399,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 20:06:18 crc kubenswrapper[4799]: > Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.579166 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.579377 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.580900 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.580942 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.580962 4799 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.798935 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.800734 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.800961 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.801098 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:18 crc kubenswrapper[4799]: I0319 20:06:18.801464 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.809906 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 20:06:18 crc kubenswrapper[4799]: E0319 20:06:18.810215 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 20:06:19 crc kubenswrapper[4799]: I0319 20:06:19.049659 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:20 crc kubenswrapper[4799]: I0319 20:06:20.050590 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:21 crc kubenswrapper[4799]: I0319 20:06:21.048995 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.048997 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.792320 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.792777 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.794293 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.794340 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.794358 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:22 crc kubenswrapper[4799]: I0319 20:06:22.799564 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:06:23 crc kubenswrapper[4799]: I0319 20:06:23.049936 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:23 crc kubenswrapper[4799]: E0319 20:06:23.204758 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:06:23 crc kubenswrapper[4799]: I0319 20:06:23.397292 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:23 crc kubenswrapper[4799]: I0319 20:06:23.398892 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:23 crc kubenswrapper[4799]: I0319 20:06:23.398966 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:23 crc kubenswrapper[4799]: I0319 20:06:23.398990 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:24 crc kubenswrapper[4799]: I0319 20:06:24.049626 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:25 crc kubenswrapper[4799]: I0319 20:06:25.055291 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:25 crc kubenswrapper[4799]: I0319 20:06:25.810594 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:25 crc kubenswrapper[4799]: I0319 20:06:25.811933 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:25 crc kubenswrapper[4799]: I0319 20:06:25.811974 4799 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:25 crc kubenswrapper[4799]: I0319 20:06:25.811988 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:25 crc kubenswrapper[4799]: I0319 20:06:25.812016 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:06:25 crc kubenswrapper[4799]: E0319 20:06:25.815952 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 20:06:25 crc kubenswrapper[4799]: E0319 20:06:25.816206 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 20:06:26 crc kubenswrapper[4799]: I0319 20:06:26.049822 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:27 crc kubenswrapper[4799]: I0319 20:06:27.048144 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:28 crc kubenswrapper[4799]: I0319 20:06:28.046756 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.046744 4799 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.115266 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.116608 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.116744 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.116829 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.117469 4799 scope.go:117] "RemoveContainer" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.432601 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.435071 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0"} Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.435190 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.436227 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 
20:06:29.436271 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:29 crc kubenswrapper[4799]: I0319 20:06:29.436286 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.046529 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.445728 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.446083 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.448298 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0" exitCode=255 Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.448338 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0"} Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.448376 4799 scope.go:117] "RemoveContainer" containerID="2f9fb74a402c205ba43f395c5cb4b9db7aa2b3e0a03a756e14e4c427442ed254" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.448729 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 
20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.449460 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.449565 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.449662 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:30 crc kubenswrapper[4799]: I0319 20:06:30.450577 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0" Mar 19 20:06:30 crc kubenswrapper[4799]: E0319 20:06:30.450856 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:30 crc kubenswrapper[4799]: W0319 20:06:30.744855 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:30 crc kubenswrapper[4799]: E0319 20:06:30.744905 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 20:06:31 crc kubenswrapper[4799]: I0319 20:06:31.049477 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:31 crc kubenswrapper[4799]: I0319 20:06:31.453526 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 20:06:32 crc kubenswrapper[4799]: I0319 20:06:32.049276 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:32 crc kubenswrapper[4799]: I0319 20:06:32.816806 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:32 crc kubenswrapper[4799]: I0319 20:06:32.817825 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:32 crc kubenswrapper[4799]: I0319 20:06:32.817849 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:32 crc kubenswrapper[4799]: I0319 20:06:32.817858 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:32 crc kubenswrapper[4799]: I0319 20:06:32.817877 4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:06:32 crc kubenswrapper[4799]: E0319 20:06:32.822255 4799 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 20:06:32 crc kubenswrapper[4799]: E0319 20:06:32.822906 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User 
\"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.049263 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:33 crc kubenswrapper[4799]: E0319 20:06:33.204885 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.646454 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.646645 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.647613 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.647650 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.647662 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:33 crc kubenswrapper[4799]: I0319 20:06:33.648202 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0" Mar 19 20:06:33 crc kubenswrapper[4799]: E0319 20:06:33.648355 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.049368 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.126101 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.396564 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.420436 4799 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.463014 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.464090 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.464145 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.464169 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:34 crc kubenswrapper[4799]: I0319 20:06:34.465044 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0" Mar 19 20:06:34 crc kubenswrapper[4799]: E0319 20:06:34.465348 4799 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:35 crc kubenswrapper[4799]: I0319 20:06:35.049773 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:36 crc kubenswrapper[4799]: I0319 20:06:36.050690 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:37 crc kubenswrapper[4799]: I0319 20:06:37.049567 4799 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:37 crc kubenswrapper[4799]: W0319 20:06:37.348312 4799 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 20:06:37 crc kubenswrapper[4799]: E0319 20:06:37.349544 4799 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 20:06:38 crc kubenswrapper[4799]: I0319 20:06:38.050354 4799 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 20:06:38 crc kubenswrapper[4799]: I0319 20:06:38.090470 4799 csr.go:261] certificate signing request csr-rgmqq is approved, waiting to be issued Mar 19 20:06:38 crc kubenswrapper[4799]: I0319 20:06:38.102015 4799 csr.go:257] certificate signing request csr-rgmqq is issued Mar 19 20:06:38 crc kubenswrapper[4799]: I0319 20:06:38.182642 4799 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 20:06:38 crc kubenswrapper[4799]: I0319 20:06:38.885539 4799 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.104228 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 18:48:24.581182723 +0000 UTC Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.104315 4799 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7318h41m45.476875093s for next certificate rotation Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.823079 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.824956 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.825016 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.825036 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.825235 
4799 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.838459 4799 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.838777 4799 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.838826 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.843551 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.843770 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.843952 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.844137 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.844263 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:39Z","lastTransitionTime":"2026-03-19T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.864975 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.876701 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.876774 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.876798 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.876829 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.876852 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:39Z","lastTransitionTime":"2026-03-19T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.892993 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.903231 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.903281 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.903298 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.903321 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.903338 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:39Z","lastTransitionTime":"2026-03-19T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.921932 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.932525 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.932740 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.932858 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.932939 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:39 crc kubenswrapper[4799]: I0319 20:06:39.933043 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:39Z","lastTransitionTime":"2026-03-19T20:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.946909 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.947123 4799 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 20:06:39 crc kubenswrapper[4799]: E0319 20:06:39.947165 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.047995 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.149202 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.249326 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: I0319 20:06:40.333087 4799 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.350602 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.451767 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.551942 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.652512 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.753017 
4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.853813 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:40 crc kubenswrapper[4799]: E0319 20:06:40.954974 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.056115 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.156951 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.257366 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.358238 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.458607 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.559647 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.660588 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.761486 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc kubenswrapper[4799]: E0319 20:06:41.861672 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:41 crc 
kubenswrapper[4799]: E0319 20:06:41.962046 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.063223 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.164234 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.264685 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.365625 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.465747 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.566127 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.666657 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.767690 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.868929 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:42 crc kubenswrapper[4799]: E0319 20:06:42.970072 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.070410 4799 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.170667 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.205909 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.271481 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.372675 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.473859 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: I0319 20:06:43.535302 4799 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.574654 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.674988 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.775798 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.876764 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:43 crc kubenswrapper[4799]: E0319 20:06:43.977623 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.079470 4799 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.180574 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.281171 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.381993 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.482353 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.583136 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.684241 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.785363 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.885514 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:44 crc kubenswrapper[4799]: E0319 20:06:44.986039 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.087206 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.187683 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc 
kubenswrapper[4799]: E0319 20:06:45.288566 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.389788 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.489978 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.590568 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.691700 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.791997 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.892185 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:45 crc kubenswrapper[4799]: E0319 20:06:45.993164 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.093832 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: I0319 20:06:46.115820 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:46 crc kubenswrapper[4799]: I0319 20:06:46.117755 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:46 crc kubenswrapper[4799]: I0319 20:06:46.117812 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 20:06:46 crc kubenswrapper[4799]: I0319 20:06:46.117831 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.194157 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.294815 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.399564 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.500084 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.600682 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.701469 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.802812 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:46 crc kubenswrapper[4799]: E0319 20:06:46.903977 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.004732 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.105652 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.206729 4799 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.307083 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.407304 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.508003 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.608645 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.708734 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.809813 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:47 crc kubenswrapper[4799]: E0319 20:06:47.910006 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.011561 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.111926 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: I0319 20:06:48.115507 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:48 crc kubenswrapper[4799]: I0319 20:06:48.117073 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:48 crc kubenswrapper[4799]: I0319 
20:06:48.117124 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:48 crc kubenswrapper[4799]: I0319 20:06:48.117143 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:48 crc kubenswrapper[4799]: I0319 20:06:48.118266 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.118587 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.212651 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.313125 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.413834 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.514327 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.615195 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.716005 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc 
kubenswrapper[4799]: E0319 20:06:48.817116 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:48 crc kubenswrapper[4799]: E0319 20:06:48.918235 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.019108 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.121049 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.222437 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.323133 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.424578 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.525314 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.626700 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.727875 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.828865 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:49 crc kubenswrapper[4799]: E0319 20:06:49.928953 4799 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.001064 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.007076 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.007145 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.007167 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.007190 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.007207 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:50Z","lastTransitionTime":"2026-03-19T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.023847 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.028989 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.029056 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.029073 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.029097 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.029117 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:50Z","lastTransitionTime":"2026-03-19T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.046347 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.051725 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.051971 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.052152 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.052322 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.052449 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:50Z","lastTransitionTime":"2026-03-19T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.063971 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.069064 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.069241 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.069336 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.069493 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:06:50 crc kubenswrapper[4799]: I0319 20:06:50.069597 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:06:50Z","lastTransitionTime":"2026-03-19T20:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.080765 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.081315 4799 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.081448 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.182223 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.283523 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.384516 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.485614 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.586898 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.688070 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 
20:06:50.788783 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.889797 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:50 crc kubenswrapper[4799]: E0319 20:06:50.990527 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.091957 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.192851 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.293165 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.393731 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.494742 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.595584 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.696303 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.797053 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.897704 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
20:06:51 crc kubenswrapper[4799]: E0319 20:06:51.998549 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.099676 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.200178 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.300714 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.401435 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.502566 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.603700 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.704437 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.804953 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:52 crc kubenswrapper[4799]: E0319 20:06:52.905224 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.006473 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.107296 4799 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: I0319 20:06:53.115728 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:06:53 crc kubenswrapper[4799]: I0319 20:06:53.117211 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:06:53 crc kubenswrapper[4799]: I0319 20:06:53.117243 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:06:53 crc kubenswrapper[4799]: I0319 20:06:53.117261 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.206573 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.207649 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.308404 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.409531 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.510249 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.611076 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.711237 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.812336 
4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:53 crc kubenswrapper[4799]: E0319 20:06:53.913629 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.013760 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.114125 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.214674 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.316655 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.417286 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.518302 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.618965 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.719891 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.821076 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:54 crc kubenswrapper[4799]: E0319 20:06:54.921439 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc 
kubenswrapper[4799]: E0319 20:06:55.022571 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.123975 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.224531 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.325061 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.426188 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.527313 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.628292 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.729157 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.829818 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:55 crc kubenswrapper[4799]: E0319 20:06:55.930184 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.030980 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.132191 4799 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.233327 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.333997 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.434302 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.534430 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.634708 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.735659 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.836701 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:56 crc kubenswrapper[4799]: E0319 20:06:56.936808 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.037886 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.139133 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.240108 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.340577 4799 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.441247 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.542106 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.642746 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.743829 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.844966 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:57 crc kubenswrapper[4799]: E0319 20:06:57.945467 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.046717 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.147809 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.248087 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.349553 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.449666 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 
20:06:58.550767 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.651220 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.752578 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.853626 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:58 crc kubenswrapper[4799]: E0319 20:06:58.953973 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.055361 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.156944 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.257968 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.358686 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.459239 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.559625 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.660738 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.761077 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.862080 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:06:59 crc kubenswrapper[4799]: E0319 20:06:59.962547 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.063211 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.115295 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.116899 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.116967 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.116990 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.118041 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.118337 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.153465 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.158135 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.158188 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.158209 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.158230 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.158246 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:00Z","lastTransitionTime":"2026-03-19T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.172465 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.178061 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.178286 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.178570 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.178782 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.178951 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:00Z","lastTransitionTime":"2026-03-19T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.194597 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.199278 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.199537 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.199698 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.199847 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.199984 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:00Z","lastTransitionTime":"2026-03-19T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.214683 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.218584 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.218643 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.218665 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.218693 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:00 crc kubenswrapper[4799]: I0319 20:07:00.218715 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:00Z","lastTransitionTime":"2026-03-19T20:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.233787 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.234045 4799 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.234094 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.334182 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.434541 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.535536 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.636578 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.737753 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.838268 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:00 crc kubenswrapper[4799]: E0319 20:07:00.938695 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.039194 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.140419 4799 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.241359 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.341798 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.442934 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.543721 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.644768 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.745827 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.845963 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:01 crc kubenswrapper[4799]: E0319 20:07:01.946086 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.046465 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.147556 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.248703 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc 
kubenswrapper[4799]: E0319 20:07:02.348883 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.449050 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.549258 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.649507 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.750219 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.851380 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:02 crc kubenswrapper[4799]: E0319 20:07:02.952219 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.052799 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.153161 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.207830 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.253773 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.354378 4799 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.455449 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.556462 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.656576 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.757021 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.858068 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:03 crc kubenswrapper[4799]: E0319 20:07:03.958735 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.059368 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.160275 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.261423 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.361894 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.462837 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.564424 4799 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.665297 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.766081 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.867198 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:04 crc kubenswrapper[4799]: E0319 20:07:04.967769 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.068464 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.168606 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.269445 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.369579 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.470777 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.571958 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.672620 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc 
kubenswrapper[4799]: E0319 20:07:05.773857 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.874038 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:05 crc kubenswrapper[4799]: E0319 20:07:05.974167 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.075183 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.176077 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.277337 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.378523 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.478849 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.579359 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.680465 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.780825 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.881972 4799 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 20:07:06 crc kubenswrapper[4799]: E0319 20:07:06.982477 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.082803 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.183293 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.284407 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.385944 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.486120 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.587057 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.688047 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.789165 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.889756 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:07 crc kubenswrapper[4799]: E0319 20:07:07.990182 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.090929 4799 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.191565 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.292009 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.392722 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.492893 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.593925 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.694676 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.795773 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.896842 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:08 crc kubenswrapper[4799]: E0319 20:07:08.997980 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.098656 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.199114 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 
20:07:09.299448 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.400168 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.501208 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.602230 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.702560 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.803045 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:09 crc kubenswrapper[4799]: E0319 20:07:09.904186 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.005250 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.105434 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.206325 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.307338 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.309583 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node 
\"crc\" not found" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.313975 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.314020 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.314035 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.314059 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.314075 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:10Z","lastTransitionTime":"2026-03-19T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.329349 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.334131 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.334192 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.334213 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.334241 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.334262 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:10Z","lastTransitionTime":"2026-03-19T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.350922 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.355186 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.355260 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.355289 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.355321 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.355346 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:10Z","lastTransitionTime":"2026-03-19T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.383277 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.403817 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.403864 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.403882 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.403906 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 20:07:10 crc kubenswrapper[4799]: I0319 20:07:10.403923 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:10Z","lastTransitionTime":"2026-03-19T20:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.416597 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.416762 4799 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.416798 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.517217 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.617573 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.718504 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.818907 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:10 crc kubenswrapper[4799]: E0319 20:07:10.919946 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.020905 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.116104 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.117792 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.117892 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.117922 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.118897 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.121095 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.221920 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.322869 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.423817 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.524778 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.563334 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.566570 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657"}
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.566762 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.568039 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.568084 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:07:11 crc kubenswrapper[4799]: I0319 20:07:11.568102 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.625515 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.726454 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.827603 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:11 crc kubenswrapper[4799]: E0319 20:07:11.928498 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.028611 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.129617 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.230661 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.331130 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.432251 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.533247 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.571806 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.572529 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.575357 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" exitCode=255
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.575454 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657"}
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.575546 4799 scope.go:117] "RemoveContainer" containerID="67c3fb7b85ae44d3e89df8ad9012106cd9f231fbdbbd10727510f4389b82b8b0"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.575694 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.577761 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.577816 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.577835 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:07:12 crc kubenswrapper[4799]: I0319 20:07:12.578768 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.579084 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.633814 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.734760 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.835917 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:12 crc kubenswrapper[4799]: E0319 20:07:12.936963 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.038150 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.139230 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.208857 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.239562 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.340219 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.440883 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.541956 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.583859 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.642949 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.646179 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.646419 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.648030 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.648088 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.648112 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:07:13 crc kubenswrapper[4799]: I0319 20:07:13.649199 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.649523 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.743308 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.844086 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:13 crc kubenswrapper[4799]: E0319 20:07:13.945061 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.045171 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:14 crc kubenswrapper[4799]: I0319 20:07:14.125706 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.146239 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.247171 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.348205 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.449178 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.549462 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:14 crc kubenswrapper[4799]: I0319 20:07:14.588763 4799 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 20:07:14 crc kubenswrapper[4799]: I0319 20:07:14.590162 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:14 crc kubenswrapper[4799]: I0319 20:07:14.590217 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:14 crc kubenswrapper[4799]: I0319 20:07:14.590239 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:14 crc kubenswrapper[4799]: I0319 20:07:14.591220 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.591550 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.649593 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.750568 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.851297 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:14 crc kubenswrapper[4799]: E0319 20:07:14.951516 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.052555 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.153231 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.254350 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.355587 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.456742 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.557752 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.658135 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.759256 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.859934 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:15 crc kubenswrapper[4799]: E0319 20:07:15.961030 4799 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.061426 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.161687 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.262655 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.363053 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.463482 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: I0319 20:07:16.503112 4799 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.564216 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.664978 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.765900 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.866459 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:16 crc kubenswrapper[4799]: E0319 20:07:16.967612 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.068773 4799 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.169369 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.269985 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.370734 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.471721 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.572815 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.673617 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.774427 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.875456 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:17 crc kubenswrapper[4799]: E0319 20:07:17.979417 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.080235 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.181146 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc 
kubenswrapper[4799]: E0319 20:07:18.281566 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.382243 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.483045 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.583834 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.684720 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.785622 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.885735 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:18 crc kubenswrapper[4799]: E0319 20:07:18.986046 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.086226 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.186856 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.287869 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.388626 4799 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.489091 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.589502 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.690682 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.791737 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.892585 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:19 crc kubenswrapper[4799]: E0319 20:07:19.993458 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.094216 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.194324 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.294478 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.395358 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.438214 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.454888 4799 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.454951 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.454975 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.455004 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.455027 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:20Z","lastTransitionTime":"2026-03-19T20:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.472276 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.481371 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.481457 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.481475 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.481497 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.481515 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:20Z","lastTransitionTime":"2026-03-19T20:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.497630 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e11fefaa-2228-43f6-a9fc-d0587c913216\\\",\\\"systemUUID\\\":\\\"26a95011-54a2-4e43-8b10-a5eb1bfc734c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.502123 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.502177 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.502196 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.502220 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.502238 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:20Z","lastTransitionTime":"2026-03-19T20:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.516183 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous attempt ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.521922 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.521979 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.521996 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.522017 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:20 crc kubenswrapper[4799]: I0319 20:07:20.522034 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:20Z","lastTransitionTime":"2026-03-19T20:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.536459 4799 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous attempt ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.536676 4799 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.536725 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.637740 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.737859 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.838791 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:20 crc kubenswrapper[4799]: E0319 20:07:20.939282 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.040434 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.141030 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.241901 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.342927 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.443462 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.544560 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.644819 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.745668 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.846162 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:21 crc kubenswrapper[4799]: E0319 20:07:21.947079 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.048124 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.148947 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.249158 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.350118 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.451194 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.551639 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.652452 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.752634 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.853476 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:22 crc kubenswrapper[4799]: E0319 20:07:22.953942 4799 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 20:07:23 crc kubenswrapper[4799]: E0319 20:07:23.054665 4799 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 19 20:07:23 crc kubenswrapper[4799]: E0319 20:07:23.209787 4799 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 20:07:23 crc kubenswrapper[4799]: E0319 20:07:23.222343 4799 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 20:07:27 crc kubenswrapper[4799]: I0319 20:07:27.940240 4799 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.098266 4799 apiserver.go:52] "Watching apiserver" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.105850 4799 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.106757 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sndjj","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-multus/multus-hgdvf","openshift-image-registry/node-ca-b4czs","openshift-machine-config-operator/machine-config-daemon-mv84p","openshift-multus/multus-additional-cni-plugins-7c9nh","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-s74jd","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw","openshift-ovn-kubernetes/ovnkube-node-b2bc2","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c"] Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.107801 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.107864 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.108023 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.108559 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.108584 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.108667 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.109220 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.109485 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.109601 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.109677 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.109684 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.109788 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.110481 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.110576 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.113565 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.111092 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.113774 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.113992 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.109670 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.115206 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.122987 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.123025 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.123556 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.123664 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.125300 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.125597 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 20:07:28 crc 
kubenswrapper[4799]: I0319 20:07:28.125979 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.126173 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.127286 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.127669 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.127983 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.128194 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.128446 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.128909 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.129111 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.129126 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.129439 4799 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.129620 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.129855 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.130786 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.131589 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.134307 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135035 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.134861 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.134991 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135085 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135300 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135220 4799 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135701 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135831 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135904 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.135976 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.136266 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.137024 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.137295 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.149794 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.150373 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.150641 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gxfn\" (UniqueName: \"kubernetes.io/projected/c37c5ae2-a119-4000-8dbc-2121414ca310-kube-api-access-4gxfn\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.150889 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.151122 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b-hosts-file\") pod \"node-resolver-s74jd\" (UID: \"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\") " pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.151344 4799 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.151365 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-cnibin\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " 
pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.151837 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.152062 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c37c5ae2-a119-4000-8dbc-2121414ca310-host\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.152293 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.152570 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-os-release\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.151203 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.152776 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.153140 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/13325db3-9ac8-4029-b524-5e1be40273be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.153299 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.153536 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.153716 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.153664 4799 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.153885 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13325db3-9ac8-4029-b524-5e1be40273be-cni-binary-copy\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.154309 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668x7\" (UniqueName: \"kubernetes.io/projected/21434d03-6102-44f2-bb92-e1cb4efc47f9-kube-api-access-668x7\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.154582 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c37c5ae2-a119-4000-8dbc-2121414ca310-serviceca\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.154774 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-879qc\" 
(UniqueName: \"kubernetes.io/projected/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-kube-api-access-879qc\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.154936 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.155092 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-system-cni-dir\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.155255 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lstnw\" (UniqueName: \"kubernetes.io/projected/13325db3-9ac8-4029-b524-5e1be40273be-kube-api-access-lstnw\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.155520 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 
20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.155748 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.156000 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.158342 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.160149 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.164757 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.167888 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.178739 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.178781 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.178805 4799 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.178890 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:28.678862824 +0000 UTC m=+126.284815926 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.179197 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.183339 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.191828 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.191830 4799 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-sndjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21434d03-6102-44f2-bb92-e1cb4efc47f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sndjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.204149 4799 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf986000-80c1-4cf1-8648-d2f7ee370e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mv84p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.223800 4799 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.231509 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c4f2665-70de-4a4f-85d2-c93b098c910a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b2bc2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.245078 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256623 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256702 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256754 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256805 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256851 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256897 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256941 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.256992 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 20:07:28 crc 
kubenswrapper[4799]: I0319 20:07:28.257039 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257084 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257127 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257173 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257183 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257217 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257266 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257315 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257363 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257436 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257458 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257548 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257651 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257766 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257814 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257875 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257920 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.257967 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258011 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258057 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258110 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258118 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258161 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258176 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258184 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258211 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258291 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258351 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258537 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 20:07:28 crc 
kubenswrapper[4799]: I0319 20:07:28.258591 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258626 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258666 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258700 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258742 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258777 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258811 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258414 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259085 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258521 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258558 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258750 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258785 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258784 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.258860 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259333 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259337 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259435 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259454 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259487 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259496 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259644 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259704 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259695 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259744 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259795 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259829 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259864 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259898 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259953 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.259989 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260673 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260747 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260781 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260823 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 20:07:28 crc 
kubenswrapper[4799]: I0319 20:07:28.260862 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260898 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260932 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260966 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260998 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261031 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261066 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261111 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261149 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261183 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261222 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261256 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261292 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261325 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261360 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261420 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261454 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261487 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261518 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261552 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261586 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261656 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261700 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261737 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261771 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261812 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261845 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261879 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261914 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261951 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261990 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262029 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262067 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262100 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262133 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262168 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262625 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262664 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262700 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262776 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262900 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262939 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262974 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260063 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" 
(UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263009 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260693 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.260752 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263044 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263083 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263118 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263158 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263192 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263227 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263263 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263297 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263331 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263366 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263443 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 20:07:28 crc 
kubenswrapper[4799]: I0319 20:07:28.263478 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263512 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263547 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263585 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263642 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263679 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263716 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263751 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263789 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263823 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263857 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 20:07:28 crc kubenswrapper[4799]: 
I0319 20:07:28.263892 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263926 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263959 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263991 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264032 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264068 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264103 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264136 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264167 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264201 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264237 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 
20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264286 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264330 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264413 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264466 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264519 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264575 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264628 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264685 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264741 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264798 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264853 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 20:07:28 crc kubenswrapper[4799]: 
I0319 20:07:28.264908 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265175 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264964 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268541 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268583 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268618 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268655 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268690 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268730 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268768 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268806 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" 
(UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268841 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268887 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268921 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268965 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269006 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269048 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269091 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269125 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269160 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269194 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269230 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 20:07:28 
crc kubenswrapper[4799]: I0319 20:07:28.269269 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269304 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269341 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269375 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269462 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269498 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269536 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269600 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269682 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269731 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269774 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269818 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269861 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269903 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269945 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269990 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273043 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273257 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273309 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273360 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273493 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273806 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.273898 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274168 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274262 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274334 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274443 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274630 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274678 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274718 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274759 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274803 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.274975 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275035 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275111 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-netns\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275305 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/375732b9-7d32-4090-b9d0-f6168107436b-multus-daemon-config\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275365 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-multus-certs\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275643 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cf986000-80c1-4cf1-8648-d2f7ee370e88-rootfs\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275692 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-ovn-kubernetes\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275752 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275801 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.275980 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzc9c\" (UniqueName: \"kubernetes.io/projected/3c4f2665-70de-4a4f-85d2-c93b098c910a-kube-api-access-bzc9c\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276036 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276085 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-cni-bin\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276124 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-kubelet\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276165 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovn-node-metrics-cert\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276283 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276446 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h282\" (UniqueName: \"kubernetes.io/projected/223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b-kube-api-access-6h282\") pod 
\"node-resolver-s74jd\" (UID: \"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\") " pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276667 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-slash\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277240 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-log-socket\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277329 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-netd\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277450 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-cnibin\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277516 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-k8s-cni-cncf-io\") pod \"multus-hgdvf\" (UID: 
\"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277558 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-hostroot\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277595 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-conf-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277759 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/13325db3-9ac8-4029-b524-5e1be40273be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277805 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-system-cni-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277875 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.277997 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668x7\" (UniqueName: \"kubernetes.io/projected/21434d03-6102-44f2-bb92-e1cb4efc47f9-kube-api-access-668x7\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278083 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-ovn\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278148 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf986000-80c1-4cf1-8648-d2f7ee370e88-proxy-tls\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278218 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-879qc\" (UniqueName: \"kubernetes.io/projected/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-kube-api-access-879qc\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278351 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-system-cni-dir\") pod 
\"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.279560 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-system-cni-dir\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261206 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.280607 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261338 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261596 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.280820 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261991 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262327 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262495 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262372 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262643 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.262988 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.263075 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264224 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264301 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264232 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264356 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264444 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264603 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264636 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264756 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264536 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264912 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.264928 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265091 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265147 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265176 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265745 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265824 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265840 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265889 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.265937 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.266286 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.266436 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.266541 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.266845 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.282912 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.282957 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.283480 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.283518 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.283716 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.283847 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.283972 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.286059 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.284325 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267047 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267109 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267304 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267610 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267835 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267854 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267859 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268081 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268334 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268431 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268491 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268710 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.268771 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269202 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269252 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.269852 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276251 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.276290 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278408 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.278447 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:28.778368198 +0000 UTC m=+126.384321280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.288441 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.288651 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-systemd\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.288822 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-etc-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: 
\"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.289862 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/13325db3-9ac8-4029-b524-5e1be40273be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.290815 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-systemd-units\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.291000 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-cni-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.290917 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.291056 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.290833 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.291990 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292088 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292232 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.285816 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292311 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-os-release\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292455 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" 
(UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292528 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxfn\" (UniqueName: \"kubernetes.io/projected/c37c5ae2-a119-4000-8dbc-2121414ca310-kube-api-access-4gxfn\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292621 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292680 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-script-lib\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292734 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf986000-80c1-4cf1-8648-d2f7ee370e88-mcd-auth-proxy-config\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292804 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-88h9d\" (UniqueName: \"kubernetes.io/projected/375732b9-7d32-4090-b9d0-f6168107436b-kube-api-access-88h9d\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292813 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292859 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-bin\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292909 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/375732b9-7d32-4090-b9d0-f6168107436b-cni-binary-copy\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.292951 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-cni-multus\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293019 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-cnibin\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293074 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b-hosts-file\") pod \"node-resolver-s74jd\" (UID: \"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\") " pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293153 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c37c5ae2-a119-4000-8dbc-2121414ca310-host\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293194 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-os-release\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293179 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293250 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-config\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293319 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293371 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-node-log\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293453 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293504 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: 
\"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293446 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293556 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-var-lib-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293619 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13325db3-9ac8-4029-b524-5e1be40273be-cni-binary-copy\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293664 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-kubelet\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293745 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-env-overrides\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293793 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-socket-dir-parent\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293847 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293892 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-netns\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293976 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4859g\" (UniqueName: \"kubernetes.io/projected/cf986000-80c1-4cf1-8648-d2f7ee370e88-kube-api-access-4859g\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294019 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-etc-kubernetes\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294079 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c37c5ae2-a119-4000-8dbc-2121414ca310-serviceca\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294148 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lstnw\" (UniqueName: \"kubernetes.io/projected/13325db3-9ac8-4029-b524-5e1be40273be-kube-api-access-lstnw\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294532 4799 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294566 4799 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294601 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294623 4799 reconciler_common.go:293] "Volume detached for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294648 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294672 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294705 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294729 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294756 4799 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294789 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294813 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294834 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295055 4799 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295086 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295109 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295133 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295158 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295188 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 
19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295214 4799 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295237 4799 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295265 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295302 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295324 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295347 4799 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295414 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc 
kubenswrapper[4799]: I0319 20:07:28.295447 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295475 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295523 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295569 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295604 4799 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295633 4799 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295666 4799 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295710 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295693 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s74jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h282\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s74jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295755 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295790 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295832 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" 
(UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295862 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296099 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296207 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-cnibin\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296216 4799 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296314 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296352 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296417 
4799 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293558 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296667 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296674 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296903 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b-hosts-file\") pod \"node-resolver-s74jd\" (UID: \"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\") " pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296938 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c37c5ae2-a119-4000-8dbc-2121414ca310-host\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297014 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13325db3-9ac8-4029-b524-5e1be40273be-os-release\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297044 4799 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297065 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297088 4799 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: 
I0319 20:07:28.297109 4799 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297126 4799 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297142 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297165 4799 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297178 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297199 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297613 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.297929 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298215 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298521 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298598 4799 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298739 4799 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298771 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298799 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298831 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298856 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298902 4799 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298925 
4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298954 4799 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.298976 4799 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299009 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299032 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299062 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299085 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299107 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299141 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299216 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299934 4799 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299963 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.300048 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.300124 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.300233 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.301655 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13325db3-9ac8-4029-b524-5e1be40273be-cni-binary-copy\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299290 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299410 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.302836 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.303037 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.303851 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.304038 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.304072 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c37c5ae2-a119-4000-8dbc-2121414ca310-serviceca\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.304301 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.304679 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.304719 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.305362 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.305764 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.305796 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.305817 4799 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.305843 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.313697 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.314237 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.314752 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.317423 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.279770 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278871 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.279968 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.280224 4799 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.280506 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.280637 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.261906 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.284946 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.285317 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.285437 4799 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.285936 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.286740 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.287559 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.291130 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.278641 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.291266 4799 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.291705 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293197 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293211 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293726 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293889 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.293951 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.294773 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295045 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295070 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.267034 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295142 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295473 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295486 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295620 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.295727 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.296336 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.299961 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.300360 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.318634 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.319714 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668x7\" (UniqueName: \"kubernetes.io/projected/21434d03-6102-44f2-bb92-e1cb4efc47f9-kube-api-access-668x7\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.320510 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.321775 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7zfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.321905 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gxfn\" (UniqueName: \"kubernetes.io/projected/c37c5ae2-a119-4000-8dbc-2121414ca310-kube-api-access-4gxfn\") pod \"node-ca-b4czs\" (UID: \"c37c5ae2-a119-4000-8dbc-2121414ca310\") " pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.322801 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.322927 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.322959 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.322984 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323008 4799 reconciler_common.go:293] 
"Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323028 4799 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323052 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323085 4799 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323119 4799 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323113 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323146 4799 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323182 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323228 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323256 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323369 4799 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323440 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323508 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 
20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323553 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323581 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323599 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.323714 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:28.823666693 +0000 UTC m=+126.429619775 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.323922 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:28.823906761 +0000 UTC m=+126.429859843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.323950 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.323983 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs podName:21434d03-6102-44f2-bb92-e1cb4efc47f9 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:28.823974083 +0000 UTC m=+126.429927175 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs") pod "network-metrics-daemon-sndjj" (UID: "21434d03-6102-44f2-bb92-e1cb4efc47f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.324060 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:28.824045645 +0000 UTC m=+126.429998727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.324318 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.324406 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.324601 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.325998 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.324476 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-879qc\" (UniqueName: \"kubernetes.io/projected/6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a-kube-api-access-879qc\") pod \"ovnkube-control-plane-749d76644c-v7zfw\" (UID: \"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.326758 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.326765 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.326904 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.327195 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.327415 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lstnw\" (UniqueName: \"kubernetes.io/projected/13325db3-9ac8-4029-b524-5e1be40273be-kube-api-access-lstnw\") pod \"multus-additional-cni-plugins-7c9nh\" (UID: \"13325db3-9ac8-4029-b524-5e1be40273be\") " pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.327764 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.328174 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.328024 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.328421 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.329610 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.334623 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.334731 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.334749 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.334824 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335061 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335210 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335302 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335354 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.329691 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.329769 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.329801 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.329837 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.329859 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.330901 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.331025 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.331321 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.331449 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.331680 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335743 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335824 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.336092 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.336204 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.336279 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.336782 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.335535 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.337971 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.337981 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.338694 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.339248 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.339708 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.340223 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.337868 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.340991 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.346443 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.352659 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.353314 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.355249 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13325db3-9ac8-4029-b524-5e1be40273be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7c9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.367196 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.368874 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.375413 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b4czs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c37c5ae2-a119-4000-8dbc-2121414ca310\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gxfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b4czs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.386860 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hgdvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"375732b9-7d32-4090-b9d0-f6168107436b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88h9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hgdvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.399709 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424661 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-config\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424718 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-var-lib-openvswitch\") pod 
\"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424743 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-node-log\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424770 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424792 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-env-overrides\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424816 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-socket-dir-parent\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424852 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-kubelet\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424872 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-node-log\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424880 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-etc-kubernetes\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424924 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-etc-kubernetes\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.424989 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-netns\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425023 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-kubelet\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425026 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4859g\" (UniqueName: \"kubernetes.io/projected/cf986000-80c1-4cf1-8648-d2f7ee370e88-kube-api-access-4859g\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425040 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-socket-dir-parent\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425097 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-netns\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425073 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-netns\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425239 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/375732b9-7d32-4090-b9d0-f6168107436b-multus-daemon-config\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425289 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-multus-certs\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425289 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425161 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-var-lib-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425349 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-netns\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425361 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-ovn-kubernetes\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425435 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-multus-certs\") 
pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425454 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cf986000-80c1-4cf1-8648-d2f7ee370e88-rootfs\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425486 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-ovn-kubernetes\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425517 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425554 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzc9c\" (UniqueName: \"kubernetes.io/projected/3c4f2665-70de-4a4f-85d2-c93b098c910a-kube-api-access-bzc9c\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425588 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: 
\"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425625 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-cni-bin\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425658 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-kubelet\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425693 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h282\" (UniqueName: \"kubernetes.io/projected/223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b-kube-api-access-6h282\") pod \"node-resolver-s74jd\" (UID: \"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\") " pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425737 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovn-node-metrics-cert\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425805 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-log-socket\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425840 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-netd\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425872 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-cnibin\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425877 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-kubelet\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425922 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-slash\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-conf-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426020 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-cni-bin\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.425521 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cf986000-80c1-4cf1-8648-d2f7ee370e88-rootfs\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426082 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-system-cni-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426149 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-k8s-cni-cncf-io\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426181 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-hostroot\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426214 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-ovn\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426249 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf986000-80c1-4cf1-8648-d2f7ee370e88-proxy-tls\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426307 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-systemd\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426338 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-etc-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426416 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-systemd-units\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426447 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-cni-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426453 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-env-overrides\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426481 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-os-release\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426518 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf986000-80c1-4cf1-8648-d2f7ee370e88-mcd-auth-proxy-config\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426528 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-log-socket\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426554 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88h9d\" (UniqueName: \"kubernetes.io/projected/375732b9-7d32-4090-b9d0-f6168107436b-kube-api-access-88h9d\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426578 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-systemd\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426586 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-bin\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426621 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426645 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-system-cni-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426656 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-script-lib\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426647 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-cnibin\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426691 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/375732b9-7d32-4090-b9d0-f6168107436b-cni-binary-copy\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426714 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-run-k8s-cni-cncf-io\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426718 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/375732b9-7d32-4090-b9d0-f6168107436b-multus-daemon-config\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426759 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-hostroot\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426722 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-cni-multus\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426779 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-host-var-lib-cni-multus\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426825 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-netd\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426891 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-systemd-units\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426894 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-ovn\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426922 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-etc-openvswitch\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426941 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.426984 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-slash\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427023 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427040 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-conf-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427177 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-bin\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427544 4799 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427558 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-os-release\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427576 4799 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427596 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427614 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427630 4799 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427645 4799 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427660 4799 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427676 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427695 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427712 4799 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427742 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427760 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427776 4799 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427791 4799 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427806 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427822 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427839 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427856 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427874 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427892 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427910 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427926 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427942 4799 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427958 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427973 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.427988 4799 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428004 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428018 4799 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428034 4799 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428049 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428066 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428082 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428097 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428112 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428129 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428145 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428149 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375732b9-7d32-4090-b9d0-f6168107436b-multus-cni-dir\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf"
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428163 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428209 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428229 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428247 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428264 4799 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428281 4799 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428299 4799 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428316 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428332 4799 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428348 4799 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428367 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428408 4799 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428425 4799 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428440 4799 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428453 4799 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428464 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428477 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428488 4799 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428499 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428511 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428522 4799 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428536 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428548 4799 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428561 4799 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428573 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428591 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428720 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428737 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428749 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428761 4799 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428793 4799 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428809 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428820 4799 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428832 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428843 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428855 4799 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.428865 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429075 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429092 4799 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429104 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429117 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429128 4799 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429139 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429343 4799 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429358 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429454 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429476 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429488 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429500 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429511 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429523 4799 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429533 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429544 4799 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429556 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429567 4799 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429578 4799 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429589 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429600 4799 reconciler_common.go:293]
"Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429611 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429622 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429634 4799 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429645 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429655 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429668 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429678 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429689 4799 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.429739 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-config\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.430818 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf986000-80c1-4cf1-8648-d2f7ee370e88-proxy-tls\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.431068 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf986000-80c1-4cf1-8648-d2f7ee370e88-mcd-auth-proxy-config\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.432276 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/375732b9-7d32-4090-b9d0-f6168107436b-cni-binary-copy\") pod \"multus-hgdvf\" (UID: \"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.433150 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-script-lib\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.435471 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovn-node-metrics-cert\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.445625 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.446652 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h282\" (UniqueName: \"kubernetes.io/projected/223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b-kube-api-access-6h282\") pod \"node-resolver-s74jd\" (UID: \"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\") " pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.451701 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzc9c\" (UniqueName: \"kubernetes.io/projected/3c4f2665-70de-4a4f-85d2-c93b098c910a-kube-api-access-bzc9c\") pod \"ovnkube-node-b2bc2\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.456060 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88h9d\" (UniqueName: \"kubernetes.io/projected/375732b9-7d32-4090-b9d0-f6168107436b-kube-api-access-88h9d\") pod \"multus-hgdvf\" (UID: 
\"375732b9-7d32-4090-b9d0-f6168107436b\") " pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.458671 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4859g\" (UniqueName: \"kubernetes.io/projected/cf986000-80c1-4cf1-8648-d2f7ee370e88-kube-api-access-4859g\") pod \"machine-config-daemon-mv84p\" (UID: \"cf986000-80c1-4cf1-8648-d2f7ee370e88\") " pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.465723 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cb50c5a5839cb108db68590412dca79c5bce9ac2ab4b43a3e21bb34f6cee905b WatchSource:0}: Error finding container cb50c5a5839cb108db68590412dca79c5bce9ac2ab4b43a3e21bb34f6cee905b: Status 404 returned error can't find the container with id cb50c5a5839cb108db68590412dca79c5bce9ac2ab4b43a3e21bb34f6cee905b Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.487608 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s74jd" Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.502758 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223f0ded_7a2e_477c_b7fe_22eb1c2c9c8b.slice/crio-63c940074575e1273b8ea7bfd512e63b2cf430738eb9b90cf6382fe116fab52c WatchSource:0}: Error finding container 63c940074575e1273b8ea7bfd512e63b2cf430738eb9b90cf6382fe116fab52c: Status 404 returned error can't find the container with id 63c940074575e1273b8ea7bfd512e63b2cf430738eb9b90cf6382fe116fab52c Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.525162 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b4czs" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.534069 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.546248 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc37c5ae2_a119_4000_8dbc_2121414ca310.slice/crio-eadc456d3a72bf7eafe35849cec24f1e0d148adab3197544d4f2c4ed8f8df689 WatchSource:0}: Error finding container eadc456d3a72bf7eafe35849cec24f1e0d148adab3197544d4f2c4ed8f8df689: Status 404 returned error can't find the container with id eadc456d3a72bf7eafe35849cec24f1e0d148adab3197544d4f2c4ed8f8df689 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.546714 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.555265 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.567847 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hgdvf" Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.573027 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c4f2665_70de_4a4f_85d2_c93b098c910a.slice/crio-c24142686369c05044686930e821e2347fba48e9764de3de0dd6fd9935f1bc64 WatchSource:0}: Error finding container c24142686369c05044686930e821e2347fba48e9764de3de0dd6fd9935f1bc64: Status 404 returned error can't find the container with id c24142686369c05044686930e821e2347fba48e9764de3de0dd6fd9935f1bc64 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.575965 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.589838 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13325db3_9ac8_4029_b524_5e1be40273be.slice/crio-5c3a7f396c2134149a1e1f11b03ab1a03e8b91e76cf24fe32107aed1df4026c3 WatchSource:0}: Error finding container 5c3a7f396c2134149a1e1f11b03ab1a03e8b91e76cf24fe32107aed1df4026c3: Status 404 returned error can't find the container with id 5c3a7f396c2134149a1e1f11b03ab1a03e8b91e76cf24fe32107aed1df4026c3 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.596567 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.622269 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-93fa54178df224c24826cb8eefbdf520ca8a805cec5ce3caf720a63298678580 WatchSource:0}: Error finding container 93fa54178df224c24826cb8eefbdf520ca8a805cec5ce3caf720a63298678580: Status 404 returned error can't find the container with id 93fa54178df224c24826cb8eefbdf520ca8a805cec5ce3caf720a63298678580 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.632430 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b4czs" event={"ID":"c37c5ae2-a119-4000-8dbc-2121414ca310","Type":"ContainerStarted","Data":"eadc456d3a72bf7eafe35849cec24f1e0d148adab3197544d4f2c4ed8f8df689"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.633935 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s74jd" event={"ID":"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b","Type":"ContainerStarted","Data":"63c940074575e1273b8ea7bfd512e63b2cf430738eb9b90cf6382fe116fab52c"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.635147 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"93fa54178df224c24826cb8eefbdf520ca8a805cec5ce3caf720a63298678580"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.636607 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgdvf" event={"ID":"375732b9-7d32-4090-b9d0-f6168107436b","Type":"ContainerStarted","Data":"2e9ed2fa7fbb9ddc0f18736bbfdfcf526102985afff7defc559df4bcd0dfaff6"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.639542 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cb50c5a5839cb108db68590412dca79c5bce9ac2ab4b43a3e21bb34f6cee905b"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.641673 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerStarted","Data":"5c3a7f396c2134149a1e1f11b03ab1a03e8b91e76cf24fe32107aed1df4026c3"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.643379 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"c24142686369c05044686930e821e2347fba48e9764de3de0dd6fd9935f1bc64"} Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.644593 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" event={"ID":"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a","Type":"ContainerStarted","Data":"508fb098ab6561b6ad068815a9dfcb0697184c44476e41ddd882a47247236e8b"} Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.687652 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d6fda190f9c0425795ad68d057e710110ca39a50eb6bb9f4b36267d921e5db72 WatchSource:0}: Error finding container d6fda190f9c0425795ad68d057e710110ca39a50eb6bb9f4b36267d921e5db72: Status 404 returned error can't find the container with id d6fda190f9c0425795ad68d057e710110ca39a50eb6bb9f4b36267d921e5db72 Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.732455 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.732647 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.732669 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.732681 4799 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.732787 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:29.732712792 +0000 UTC m=+127.338665864 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.754756 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.833200 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833431 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:29.833398556 +0000 UTC m=+127.439351638 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.833491 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.833597 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.833636 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:28 crc kubenswrapper[4799]: I0319 20:07:28.833678 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833680 4799 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833761 4799 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833800 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833817 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:29.83380965 +0000 UTC m=+127.439762722 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833826 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833841 4799 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833894 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:29.833877902 +0000 UTC m=+127.439831064 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833917 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs podName:21434d03-6102-44f2-bb92-e1cb4efc47f9 nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:29.833907773 +0000 UTC m=+127.439860975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs") pod "network-metrics-daemon-sndjj" (UID: "21434d03-6102-44f2-bb92-e1cb4efc47f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.833982 4799 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: E0319 20:07:28.834066 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:29.834043258 +0000 UTC m=+127.439996370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:28 crc kubenswrapper[4799]: W0319 20:07:28.841714 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf986000_80c1_4cf1_8648_d2f7ee370e88.slice/crio-f6beb6e738870009b2d09d62f11ea356787f3c5d4e9de23f1e8690b843cacab2 WatchSource:0}: Error finding container f6beb6e738870009b2d09d62f11ea356787f3c5d4e9de23f1e8690b843cacab2: Status 404 returned error can't find the container with id f6beb6e738870009b2d09d62f11ea356787f3c5d4e9de23f1e8690b843cacab2 Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.123538 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.124995 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.127498 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.128893 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.131145 4799 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.132367 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.134002 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.135624 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.136016 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.136668 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.138292 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.139509 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.140739 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.142521 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.143685 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.146017 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.147302 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.148588 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.150140 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.151149 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.152478 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.153731 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.154729 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.155978 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.156731 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.157719 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.158451 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.160105 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.161633 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.162360 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.163305 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.164735 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.165491 4799 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.165728 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.168619 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.169503 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.170765 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.173846 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.174959 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.175781 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.178012 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.180420 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.181587 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.182853 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.185077 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.187221 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.188515 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.190921 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.192302 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.194203 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.194921 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.196426 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.197144 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.197989 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.199487 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.200139 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.200770 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.651681 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" exitCode=0 Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.651796 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.655774 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-b4czs" event={"ID":"c37c5ae2-a119-4000-8dbc-2121414ca310","Type":"ContainerStarted","Data":"935562efad3b52840924f3b0565735450d7ef8f7819e875218e364212e45d10a"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.657480 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d6fda190f9c0425795ad68d057e710110ca39a50eb6bb9f4b36267d921e5db72"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.659802 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgdvf" event={"ID":"375732b9-7d32-4090-b9d0-f6168107436b","Type":"ContainerStarted","Data":"4e9b2a9be48f6d2a71959464f4aa0703755e22d46cc402516abccafb85ef1d93"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.662918 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e5776470436d3d45830cc6d65a8b273785b0763dff4b41c21246ee22449f5b2a"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.663002 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab596c4a1f0163838e16805860253d4aeba7cf29c8243585c3a1935bc7c42faa"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.667226 4799 generic.go:334] "Generic (PLEG): container finished" podID="13325db3-9ac8-4029-b524-5e1be40273be" containerID="c6772f36e1218da82af8098f67e834f645d4ff49b3260925c644a63aa14e622d" exitCode=0 Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.667345 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" 
event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerDied","Data":"c6772f36e1218da82af8098f67e834f645d4ff49b3260925c644a63aa14e622d"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.670836 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1a50006fa79ef3f648f6807a5864ca2ad70a39211053d9714764a3e0e66523e0"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.674119 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" event={"ID":"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a","Type":"ContainerStarted","Data":"7a99fd6ea1d4c30a6211579f73bf4566fac2127c9377520174bfb1e1b59e5813"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.674178 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" event={"ID":"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a","Type":"ContainerStarted","Data":"edab48cf47a0aff8ab4f6a19c6b34507b7e3298a2ae0cec470b672914ba5b2b5"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.677224 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s74jd" event={"ID":"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b","Type":"ContainerStarted","Data":"4c6a14d73c6874092405ef50df7a7bafa4391540621e2e002081722e7fbca6e1"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.680874 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.681005 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"72076d7127d8bf9751a8c218b0ac47c74f7de18279db443c9c110c50429bf1d2"} Mar 19 
20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.682206 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s74jd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h282\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s74jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.682335 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427"} Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.682417 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"f6beb6e738870009b2d09d62f11ea356787f3c5d4e9de23f1e8690b843cacab2"} Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.682981 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.705989 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7zfw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.742797 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c4f2665-70de-4a4f-85d2-c93b098c910a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b2bc2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.744677 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.745662 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.745889 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.745907 4799 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 
19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.746995 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:31.74693117 +0000 UTC m=+129.352884282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.759007 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.779482 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.792545 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.804909 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13325db3-9ac8-4029-b524-5e1be40273be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7c9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.824556 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.842610 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.845586 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.845705 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.845749 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.845801 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.845928 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.846021 4799 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.846120 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:31.846087652 +0000 UTC m=+129.452040764 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.846787 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:31.846767145 +0000 UTC m=+129.452720257 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.846827 4799 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.846868 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:31.846854748 +0000 UTC m=+129.452807820 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.847117 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.847134 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.847143 4799 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.847166 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:31.847159498 +0000 UTC m=+129.453112570 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.847599 4799 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: E0319 20:07:29.847628 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs podName:21434d03-6102-44f2-bb92-e1cb4efc47f9 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:31.847620614 +0000 UTC m=+129.453573686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs") pod "network-metrics-daemon-sndjj" (UID: "21434d03-6102-44f2-bb92-e1cb4efc47f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.853448 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b4czs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c37c5ae2-a119-4000-8dbc-2121414ca310\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gxfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b4czs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.879731 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hgdvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"375732b9-7d32-4090-b9d0-f6168107436b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88h9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hgdvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.893309 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2940076-1ae1-4544-8060-faba015730bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T20:07:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 20:07:12.059524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 20:07:12.059629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 20:07:12.060496 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1394040153/tls.crt::/tmp/serving-cert-1394040153/tls.key\\\\\\\"\\\\nI0319 20:07:12.378615 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 20:07:12.383601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 20:07:12.383638 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 20:07:12.383673 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 20:07:12.383685 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 20:07:12.390688 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 20:07:12.390717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 20:07:12.390738 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 20:07:12.390751 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 20:07:12.390761 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 20:07:12.390768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 20:07:12.390774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 20:07:12.390780 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 20:07:12.393055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T20:07:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T20:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T20:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.908633 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sndjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21434d03-6102-44f2-bb92-e1cb4efc47f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sndjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc 
kubenswrapper[4799]: I0319 20:07:29.942023 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf986000-80c1-4cf1-8648-d2f7ee370e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mv84p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.957110 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.969741 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2940076-1ae1-4544-8060-faba015730bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T20:07:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 20:07:12.059524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 20:07:12.059629 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 20:07:12.060496 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1394040153/tls.crt::/tmp/serving-cert-1394040153/tls.key\\\\\\\"\\\\nI0319 20:07:12.378615 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 20:07:12.383601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 20:07:12.383638 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 20:07:12.383673 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 20:07:12.383685 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 20:07:12.390688 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 20:07:12.390717 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 20:07:12.390738 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 20:07:12.390751 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 20:07:12.390761 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 20:07:12.390768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 20:07:12.390774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 20:07:12.390780 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 20:07:12.393055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T20:07:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a
0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T20:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T20:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.982668 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:29 crc kubenswrapper[4799]: I0319 20:07:29.995550 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5776470436d3d45830cc6d65a8b273785b0763dff4b41c21246ee22449f5b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab596c4a1f0163838e16805860253d4aeba7cf29c8243585c3a1935bc7c42faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:29Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.005738 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b4czs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c37c5ae2-a119-4000-8dbc-2121414ca310\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://935562efad3b52840924f3b0565735450d7ef8f7819e875218e364212e45d10a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gxfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b4czs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.017745 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hgdvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"375732b9-7d32-4090-b9d0-f6168107436b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e9b2a9be48f6d2a71959464f4aa0703755e22d46cc402516abccafb85ef1d93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88h9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hgdvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.030749 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a50006fa79ef3f648f6807a5864ca2ad70a39211053d9714764a3e0e66523e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.040455 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sndjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21434d03-6102-44f2-bb92-e1cb4efc47f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sndjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.051457 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf986000-80c1-4cf1-8648-d2f7ee370e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72076d7127d8bf9751a8c218b0ac47c74f7de18279db443c9c110c50429bf1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-19T20:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mv84p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 
20:07:30.069506 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.080588 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s74jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6a14d73c6874092405ef50df7a7bafa4391540621e2e002081722e7fbca6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h282\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s74jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.093447 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edab48cf47a0aff8ab4f6a19c6b34507b7e3298a2ae0cec470b672914ba5b2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a99fd6ea1d4c30a6211579f73bf4566fac2127c9377520174bfb1e1b59e5813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7zfw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.111801 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c4f2665-70de-4a4f-85d2-c93b098c910a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzc9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-b2bc2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.115285 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:30 crc kubenswrapper[4799]: E0319 20:07:30.115396 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.115658 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:30 crc kubenswrapper[4799]: E0319 20:07:30.115719 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.115766 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:30 crc kubenswrapper[4799]: E0319 20:07:30.115814 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.115858 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:30 crc kubenswrapper[4799]: E0319 20:07:30.115913 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.123921 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.135361 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.153847 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13325db3-9ac8-4029-b524-5e1be40273be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6772f36e1218da82af8098f67e834f645d4ff49b3260925c644a63aa14e622d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6772f36e1218da82af8098f67e834f645d4ff49b3260925c644a63aa14e622d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lstnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7c9nh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.697016 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.697462 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.697472 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" 
event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.697481 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.697490 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.700326 4799 generic.go:334] "Generic (PLEG): container finished" podID="13325db3-9ac8-4029-b524-5e1be40273be" containerID="e565d3d809c940b612a721009211e55ea29e5f4067d9514929a575dc423fcfb4" exitCode=0 Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.701219 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerDied","Data":"e565d3d809c940b612a721009211e55ea29e5f4067d9514929a575dc423fcfb4"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.715645 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a50006fa79ef3f648f6807a5864ca2ad70a39211053d9714764a3e0e66523e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.726047 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sndjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21434d03-6102-44f2-bb92-e1cb4efc47f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-668x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sndjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc 
kubenswrapper[4799]: I0319 20:07:30.737786 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf986000-80c1-4cf1-8648-d2f7ee370e88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72076d7127d8bf9751a8c218b0ac47c74f7de18279db443c9c110c50429bf1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4859g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mv84p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.755520 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.776364 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s74jd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223f0ded-7a2e-477c-b7fe-22eb1c2c9c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c6a14d73c6874092405ef50df7a7bafa4391540621e2e002081722e7fbca6e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6h282\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s74jd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.791529 4799 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a929bcf-9b4e-4f87-be65-0bcd53d0cf5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T20:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://edab48cf47a0aff8ab4f6a19c6b34507b7e3298a2ae0cec470b672914ba5b2b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a99fd6ea1d4c30a6211579f73bf4566fac2127c9377520174bfb1e1b59e5813\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T20:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-879qc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T20:07:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v7zfw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T20:07:30Z is after 2025-08-24T17:21:41Z" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.868736 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.868782 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.868794 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.868812 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.868824 4799 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T20:07:30Z","lastTransitionTime":"2026-03-19T20:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.916441 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6"] Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.918105 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.920472 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.921272 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.921418 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.921492 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.959443 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b4czs" podStartSLOduration=64.9594247 podStartE2EDuration="1m4.9594247s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:30.944887394 +0000 UTC m=+128.550840466" watchObservedRunningTime="2026-03-19 20:07:30.9594247 +0000 UTC m=+128.565377772" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.972291 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v7zfw" podStartSLOduration=64.972280088 podStartE2EDuration="1m4.972280088s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:30.972023629 +0000 UTC m=+128.577976701" watchObservedRunningTime="2026-03-19 20:07:30.972280088 +0000 UTC 
m=+128.578233160" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.972511 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hgdvf" podStartSLOduration=64.972507936 podStartE2EDuration="1m4.972507936s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:30.959904626 +0000 UTC m=+128.565857698" watchObservedRunningTime="2026-03-19 20:07:30.972507936 +0000 UTC m=+128.578461008" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.991289 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s74jd" podStartSLOduration=64.991272186 podStartE2EDuration="1m4.991272186s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:30.990445087 +0000 UTC m=+128.596398179" watchObservedRunningTime="2026-03-19 20:07:30.991272186 +0000 UTC m=+128.597225258" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.991322 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca4f81a-4533-4191-bc27-b67103d8db39-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.991405 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7ca4f81a-4533-4191-bc27-b67103d8db39-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.991444 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7ca4f81a-4533-4191-bc27-b67103d8db39-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.991564 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ca4f81a-4533-4191-bc27-b67103d8db39-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:30 crc kubenswrapper[4799]: I0319 20:07:30.991636 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca4f81a-4533-4191-bc27-b67103d8db39-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.023272 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podStartSLOduration=65.023245076 podStartE2EDuration="1m5.023245076s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:31.011465194 +0000 UTC m=+128.617418276" watchObservedRunningTime="2026-03-19 20:07:31.023245076 
+0000 UTC m=+128.629198158" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.092867 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ca4f81a-4533-4191-bc27-b67103d8db39-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.092911 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca4f81a-4533-4191-bc27-b67103d8db39-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.092947 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca4f81a-4533-4191-bc27-b67103d8db39-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.092978 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7ca4f81a-4533-4191-bc27-b67103d8db39-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.093005 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/7ca4f81a-4533-4191-bc27-b67103d8db39-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.093071 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7ca4f81a-4533-4191-bc27-b67103d8db39-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.093804 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ca4f81a-4533-4191-bc27-b67103d8db39-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.093869 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7ca4f81a-4533-4191-bc27-b67103d8db39-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.107254 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ca4f81a-4533-4191-bc27-b67103d8db39-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: 
I0319 20:07:31.112597 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ca4f81a-4533-4191-bc27-b67103d8db39-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ps2d6\" (UID: \"7ca4f81a-4533-4191-bc27-b67103d8db39\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.125796 4799 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.137083 4799 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.230158 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" Mar 19 20:07:31 crc kubenswrapper[4799]: W0319 20:07:31.283577 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca4f81a_4533_4191_bc27_b67103d8db39.slice/crio-7e6db16fc14b764bd6d2c350f57e8431a97c46d578e2e03745da74ff5c97d1be WatchSource:0}: Error finding container 7e6db16fc14b764bd6d2c350f57e8431a97c46d578e2e03745da74ff5c97d1be: Status 404 returned error can't find the container with id 7e6db16fc14b764bd6d2c350f57e8431a97c46d578e2e03745da74ff5c97d1be Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.710050 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" event={"ID":"7ca4f81a-4533-4191-bc27-b67103d8db39","Type":"ContainerStarted","Data":"3d228d6064603a7cc4363b743dccc66b0787bca16fb5e6e7a8bb09f24098c2cf"} Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.710128 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" event={"ID":"7ca4f81a-4533-4191-bc27-b67103d8db39","Type":"ContainerStarted","Data":"7e6db16fc14b764bd6d2c350f57e8431a97c46d578e2e03745da74ff5c97d1be"} Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.714707 4799 generic.go:334] "Generic (PLEG): container finished" podID="13325db3-9ac8-4029-b524-5e1be40273be" containerID="ce738560474d04b7e53fdf5c647817ad3e688351cb923e7b4986afd629ac3e06" exitCode=0 Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.714884 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerDied","Data":"ce738560474d04b7e53fdf5c647817ad3e688351cb923e7b4986afd629ac3e06"} Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.717965 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8e2b12da957feb44f3735fc40a9bf98e9b74bfc08ad9e88b612779083ae0b26c"} Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.726790 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.773196 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ps2d6" podStartSLOduration=65.77317534 podStartE2EDuration="1m5.77317534s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:31.735980781 +0000 UTC m=+129.341933923" 
watchObservedRunningTime="2026-03-19 20:07:31.77317534 +0000 UTC m=+129.379128452" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.801466 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.801732 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.801766 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.801790 4799 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.801872 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:35.801843078 +0000 UTC m=+133.407796190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.902613 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.902692 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.902716 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.902735 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:31 crc kubenswrapper[4799]: I0319 20:07:31.902763 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.902915 4799 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.902969 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs podName:21434d03-6102-44f2-bb92-e1cb4efc47f9 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:35.902953016 +0000 UTC m=+133.508906088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs") pod "network-metrics-daemon-sndjj" (UID: "21434d03-6102-44f2-bb92-e1cb4efc47f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903024 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:35.903017018 +0000 UTC m=+133.508970090 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903077 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903087 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903096 4799 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903119 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:35.903112341 +0000 UTC m=+133.509065413 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903235 4799 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903258 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:35.903251026 +0000 UTC m=+133.509204098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903286 4799 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:31 crc kubenswrapper[4799]: E0319 20:07:31.903304 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:35.903297577 +0000 UTC m=+133.509250649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:32 crc kubenswrapper[4799]: I0319 20:07:32.115766 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:32 crc kubenswrapper[4799]: E0319 20:07:32.116155 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:32 crc kubenswrapper[4799]: I0319 20:07:32.115932 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:32 crc kubenswrapper[4799]: I0319 20:07:32.115817 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:32 crc kubenswrapper[4799]: E0319 20:07:32.116246 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:32 crc kubenswrapper[4799]: I0319 20:07:32.115979 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:32 crc kubenswrapper[4799]: E0319 20:07:32.116443 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:32 crc kubenswrapper[4799]: E0319 20:07:32.116596 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:32 crc kubenswrapper[4799]: I0319 20:07:32.735097 4799 generic.go:334] "Generic (PLEG): container finished" podID="13325db3-9ac8-4029-b524-5e1be40273be" containerID="2d477a87384389ffa11901e448264307b91f7a68ef9d754127674016d86111e0" exitCode=0 Mar 19 20:07:32 crc kubenswrapper[4799]: I0319 20:07:32.735175 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerDied","Data":"2d477a87384389ffa11901e448264307b91f7a68ef9d754127674016d86111e0"} Mar 19 20:07:33 crc kubenswrapper[4799]: E0319 20:07:33.224623 4799 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 20:07:33 crc kubenswrapper[4799]: I0319 20:07:33.743044 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} Mar 19 20:07:33 crc kubenswrapper[4799]: I0319 20:07:33.747323 4799 generic.go:334] "Generic (PLEG): container finished" podID="13325db3-9ac8-4029-b524-5e1be40273be" containerID="45b94b3f918cdf30a28cc46a57533924cf5a556ea422a628c3166f789b2c092a" exitCode=0 Mar 19 20:07:33 crc kubenswrapper[4799]: I0319 20:07:33.747416 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerDied","Data":"45b94b3f918cdf30a28cc46a57533924cf5a556ea422a628c3166f789b2c092a"} Mar 19 20:07:34 crc kubenswrapper[4799]: I0319 20:07:34.115289 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:34 crc kubenswrapper[4799]: I0319 20:07:34.115335 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:34 crc kubenswrapper[4799]: I0319 20:07:34.115415 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:34 crc kubenswrapper[4799]: I0319 20:07:34.115308 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:34 crc kubenswrapper[4799]: E0319 20:07:34.115501 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:34 crc kubenswrapper[4799]: E0319 20:07:34.115681 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:34 crc kubenswrapper[4799]: E0319 20:07:34.115763 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:34 crc kubenswrapper[4799]: E0319 20:07:34.115856 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:34 crc kubenswrapper[4799]: I0319 20:07:34.756718 4799 generic.go:334] "Generic (PLEG): container finished" podID="13325db3-9ac8-4029-b524-5e1be40273be" containerID="1387c8d19f3148d7175cb3a6407062390668fdbe082c59e86c98a0d82c2292d9" exitCode=0 Mar 19 20:07:34 crc kubenswrapper[4799]: I0319 20:07:34.756778 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerDied","Data":"1387c8d19f3148d7175cb3a6407062390668fdbe082c59e86c98a0d82c2292d9"} Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.765277 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerStarted","Data":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.765747 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.769759 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" event={"ID":"13325db3-9ac8-4029-b524-5e1be40273be","Type":"ContainerStarted","Data":"2ae568465dcc37b435245b33aa00142f21d01d30635e8a5f18445c883f7b8327"} Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.800044 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podStartSLOduration=69.800012301 podStartE2EDuration="1m9.800012301s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:35.798819953 
+0000 UTC m=+133.404773055" watchObservedRunningTime="2026-03-19 20:07:35.800012301 +0000 UTC m=+133.405965423" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.807731 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.826577 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7c9nh" podStartSLOduration=69.826551815 podStartE2EDuration="1m9.826551815s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:35.826273939 +0000 UTC m=+133.432227021" watchObservedRunningTime="2026-03-19 20:07:35.826551815 +0000 UTC m=+133.432504927" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.847895 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.848085 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.848111 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.848133 4799 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.848202 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:43.848180136 +0000 UTC m=+141.454133238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.949372 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.949607 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.949655 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.949703 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:35 crc kubenswrapper[4799]: I0319 20:07:35.949769 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949874 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949876 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:43.949832121 +0000 UTC m=+141.555785233 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949907 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949934 4799 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949951 4799 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949955 4799 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.950004 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:43.949980804 +0000 UTC m=+141.555933966 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.949894 4799 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.950060 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs podName:21434d03-6102-44f2-bb92-e1cb4efc47f9 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:43.950034266 +0000 UTC m=+141.555987398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs") pod "network-metrics-daemon-sndjj" (UID: "21434d03-6102-44f2-bb92-e1cb4efc47f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.950095 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:43.950079797 +0000 UTC m=+141.556033009 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:35 crc kubenswrapper[4799]: E0319 20:07:35.950160 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:43.950106337 +0000 UTC m=+141.556059539 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.115463 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.115485 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.115512 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.115478 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:36 crc kubenswrapper[4799]: E0319 20:07:36.115587 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:36 crc kubenswrapper[4799]: E0319 20:07:36.115653 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:36 crc kubenswrapper[4799]: E0319 20:07:36.115760 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:36 crc kubenswrapper[4799]: E0319 20:07:36.115859 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.783560 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.783654 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:36 crc kubenswrapper[4799]: I0319 20:07:36.820018 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:37 crc kubenswrapper[4799]: I0319 20:07:37.581665 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sndjj"] Mar 19 20:07:37 crc kubenswrapper[4799]: I0319 20:07:37.582124 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:37 crc kubenswrapper[4799]: E0319 20:07:37.582267 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:38 crc kubenswrapper[4799]: I0319 20:07:38.116015 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:38 crc kubenswrapper[4799]: E0319 20:07:38.116188 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:38 crc kubenswrapper[4799]: I0319 20:07:38.116718 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:38 crc kubenswrapper[4799]: E0319 20:07:38.116821 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:38 crc kubenswrapper[4799]: I0319 20:07:38.116881 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:38 crc kubenswrapper[4799]: E0319 20:07:38.116943 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:38 crc kubenswrapper[4799]: E0319 20:07:38.226170 4799 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 20:07:39 crc kubenswrapper[4799]: I0319 20:07:39.115254 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:39 crc kubenswrapper[4799]: E0319 20:07:39.115549 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:40 crc kubenswrapper[4799]: I0319 20:07:40.115638 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:40 crc kubenswrapper[4799]: I0319 20:07:40.115726 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:40 crc kubenswrapper[4799]: E0319 20:07:40.115783 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:40 crc kubenswrapper[4799]: I0319 20:07:40.115804 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:40 crc kubenswrapper[4799]: E0319 20:07:40.115958 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:40 crc kubenswrapper[4799]: E0319 20:07:40.116097 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:41 crc kubenswrapper[4799]: I0319 20:07:41.115753 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:41 crc kubenswrapper[4799]: E0319 20:07:41.115941 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:42 crc kubenswrapper[4799]: I0319 20:07:42.115090 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:42 crc kubenswrapper[4799]: I0319 20:07:42.115159 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:42 crc kubenswrapper[4799]: I0319 20:07:42.115298 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:42 crc kubenswrapper[4799]: E0319 20:07:42.115493 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 20:07:42 crc kubenswrapper[4799]: E0319 20:07:42.115603 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 20:07:42 crc kubenswrapper[4799]: E0319 20:07:42.115740 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 20:07:42 crc kubenswrapper[4799]: I0319 20:07:42.115760 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:07:42 crc kubenswrapper[4799]: E0319 20:07:42.115962 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.116091 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.117602 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sndjj" podUID="21434d03-6102-44f2-bb92-e1cb4efc47f9" Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.875265 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.875554 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.875632 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.875654 4799 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.875788 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.875760619 +0000 UTC m=+157.481713731 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.976423 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.976539 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.976574 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976673 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:59.976631396 +0000 UTC m=+157.582584518 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976681 4799 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.976713 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:43 crc kubenswrapper[4799]: I0319 20:07:43.976755 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976755 4799 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976806 4799 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976819 4799 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976834 4799 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.976777 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.976755699 +0000 UTC m=+157.582708811 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.977031 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.976968424 +0000 UTC m=+157.582921566 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.977091 4799 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.977100 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.977077116 +0000 UTC m=+157.583030298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 20:07:43 crc kubenswrapper[4799]: E0319 20:07:43.977230 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs podName:21434d03-6102-44f2-bb92-e1cb4efc47f9 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.977202339 +0000 UTC m=+157.583155491 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs") pod "network-metrics-daemon-sndjj" (UID: "21434d03-6102-44f2-bb92-e1cb4efc47f9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.115147 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.115190 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.115167 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.117591 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.117873 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.118212 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.118589 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 20:07:44 crc kubenswrapper[4799]: I0319 20:07:44.132667 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 20:07:45 crc kubenswrapper[4799]: I0319 20:07:45.115197 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:07:45 crc kubenswrapper[4799]: I0319 20:07:45.118720 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 20:07:45 crc kubenswrapper[4799]: I0319 20:07:45.118832 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 20:07:50 crc kubenswrapper[4799]: I0319 20:07:50.132219 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.482828 4799 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.536670 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nthhp"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.537584 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.540849 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6ddkj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.541557 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.543806 4799 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.543860 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.544428 4799 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.544477 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.544693 4799 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User 
"system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.544729 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.544998 4799 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.545032 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.545105 4799 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.545125 4799 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.545179 4799 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.545198 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.545316 4799 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.545341 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource 
\"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.545464 4799 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.545489 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.545563 4799 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.545698 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.546318 4799 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps 
"openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.546355 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.546550 4799 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.546582 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.546827 4799 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.546859 4799 reflector.go:158] 
"Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.550967 4799 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.551023 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.551077 4799 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.551096 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group 
\"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: W0319 20:07:51.551263 4799 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 19 20:07:51 crc kubenswrapper[4799]: E0319 20:07:51.551285 4799 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.552085 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.552391 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.557008 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.557732 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.561971 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.564188 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-942vf"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.564900 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.565034 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zxq6v"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.567490 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.567777 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.573717 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.573770 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.574071 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.574078 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 20:07:51 crc 
kubenswrapper[4799]: I0319 20:07:51.574372 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.574460 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.574751 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j5t7c"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.574985 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.575176 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.575243 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.575370 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.575702 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.575774 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.575854 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.576143 4799 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.576377 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.576659 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.576687 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.576767 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.576836 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.577138 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.577210 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.578005 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wjwdj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.580249 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.597475 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wnspm"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.598550 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lq2lw"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.598958 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.599046 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.600109 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.609844 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.610205 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.610357 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.612582 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.612681 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: 
I0319 20:07:51.612916 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.613052 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.613182 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.613650 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-49fwh"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.613685 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.613859 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614103 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614153 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614503 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614570 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614102 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614838 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614938 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.615029 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6ddkj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.615052 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.615150 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614171 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614236 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614243 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614281 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614328 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614846 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614363 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614443 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614469 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614498 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614649 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614751 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.614771 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.615945 4799 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.616366 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.616487 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nthhp"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.616780 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.616892 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.617832 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.617979 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.618235 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.618455 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.619432 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-942vf"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.619473 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c"] Mar 19 20:07:51 crc 
kubenswrapper[4799]: I0319 20:07:51.620296 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zxq6v"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.620540 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.621164 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.621334 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.621888 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.621993 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.625695 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.625898 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.626017 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.626725 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.626923 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zxk4t"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.627171 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.627223 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.627390 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wjn4f"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.627549 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.627924 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.628028 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.628521 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.628550 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.629068 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663447 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-encryption-config\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663483 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b363a50c-90bb-42cf-8129-2b5672a86812-audit-dir\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663506 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-config\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663525 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hslzg\" (UniqueName: \"kubernetes.io/projected/0b71b47b-d667-49c1-ae5b-3326bcde5508-kube-api-access-hslzg\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663547 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nhbq8\" (UniqueName: \"kubernetes.io/projected/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-kube-api-access-nhbq8\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663579 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-serving-cert\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663599 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663627 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663645 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit-dir\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663666 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-serving-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663684 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663703 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b71b47b-d667-49c1-ae5b-3326bcde5508-images\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663723 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlqrg\" (UniqueName: \"kubernetes.io/projected/b363a50c-90bb-42cf-8129-2b5672a86812-kube-api-access-vlqrg\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663743 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05f6d19-514d-4d86-8c9b-31606b794551-serving-cert\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 
20:07:51.663760 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663780 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663799 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663819 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-node-pullsecrets\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663851 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 
20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663868 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-serving-cert\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663889 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663906 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x996k\" (UniqueName: \"kubernetes.io/projected/b05f6d19-514d-4d86-8c9b-31606b794551-kube-api-access-x996k\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663925 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-encryption-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663946 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-trusted-ca-bundle\") 
pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663970 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b71b47b-d667-49c1-ae5b-3326bcde5508-config\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.663991 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-service-ca-bundle\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.664010 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-audit-policies\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.664030 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.664048 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-etcd-client\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.664361 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2xv74"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.665162 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.665756 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hqtdl"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.665795 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.666665 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.666934 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.672710 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.677827 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.678143 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.678244 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.678399 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.678531 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.678984 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.679264 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.679600 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680789 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680801 4799 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680849 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681820 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680881 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680923 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680954 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.680979 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681009 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681064 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681112 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681149 4799 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681235 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681672 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681780 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.681936 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.682823 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.683204 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.690557 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.692269 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.699498 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.700072 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.700247 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.700931 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.701408 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.702157 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.702516 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.704929 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.710107 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x5czz"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.712934 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.713668 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.713897 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.720130 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.722719 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.723252 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.729273 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mtvrz"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.729926 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.737746 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-dzmpr"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.738418 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.739305 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.739764 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.740137 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.742182 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.745443 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.745540 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.745755 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbgk7"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.746038 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.746160 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.746313 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.747038 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.748942 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j5t7c"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.748983 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.749068 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.749543 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.749816 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.751750 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.754481 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wjn4f"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.754532 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gksq8"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.755621 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.755688 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.756544 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.759369 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.765509 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.766072 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x5czz"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.766924 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-config\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.766956 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.766982 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-default-certificate\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767007 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/31337dcf-6a02-4e29-a768-15c7077c99d7-srv-cert\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767029 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767049 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767072 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767099 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9tcx\" (UniqueName: \"kubernetes.io/projected/18d60a3e-2409-4136-b0f2-f46a3860cc9f-kube-api-access-p9tcx\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767120 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767161 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-service-ca-bundle\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767186 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2bxb\" (UniqueName: \"kubernetes.io/projected/343a3122-a4be-4c67-bef4-22cd0e482cea-kube-api-access-x2bxb\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767238 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00dca6c-4b7a-4e1a-978b-44e97cba491a-trusted-ca\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.767829 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-service-ca-bundle\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768664 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-audit-policies\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768734 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768764 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75926aed-864d-42ee-aabf-89e5579606a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 
20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768791 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-metrics-certs\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768817 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c676b337-d700-423f-9fdc-d82946d7e0c4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5dnp8\" (UID: \"c676b337-d700-423f-9fdc-d82946d7e0c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768844 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5j4\" (UniqueName: \"kubernetes.io/projected/f61ce38b-72a4-41b0-9416-7956d3cc1133-kube-api-access-6r5j4\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768867 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-etcd-client\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768892 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/eb2fa492-7952-4155-9f40-bb6f9c569a4f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768915 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-service-ca\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768935 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mz8h\" (UniqueName: \"kubernetes.io/projected/31337dcf-6a02-4e29-a768-15c7077c99d7-kube-api-access-2mz8h\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768957 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f61ce38b-72a4-41b0-9416-7956d3cc1133-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.768976 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxm44\" (UniqueName: \"kubernetes.io/projected/e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d-kube-api-access-gxm44\") pod \"downloads-7954f5f757-49fwh\" (UID: \"e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d\") " 
pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769012 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5fd\" (UniqueName: \"kubernetes.io/projected/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-kube-api-access-mm5fd\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769044 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00dca6c-4b7a-4e1a-978b-44e97cba491a-serving-cert\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769067 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769086 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769108 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6627dc01-4b17-4951-ac65-b51d037f5215-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769150 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-encryption-config\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769167 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b363a50c-90bb-42cf-8129-2b5672a86812-audit-dir\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769186 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-config\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769206 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-serving-cert\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" 
Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769227 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lds6j\" (UniqueName: \"kubernetes.io/projected/75926aed-864d-42ee-aabf-89e5579606a7-kube-api-access-lds6j\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769244 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00dca6c-4b7a-4e1a-978b-44e97cba491a-config\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769264 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgj9w\" (UniqueName: \"kubernetes.io/projected/e7a739cc-cb77-45dc-9811-661046ccf05b-kube-api-access-tgj9w\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769285 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hslzg\" (UniqueName: \"kubernetes.io/projected/0b71b47b-d667-49c1-ae5b-3326bcde5508-kube-api-access-hslzg\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769306 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/274f0b01-a045-405f-80e8-f278a87a97ce-audit-dir\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769329 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbq8\" (UniqueName: \"kubernetes.io/projected/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-kube-api-access-nhbq8\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769353 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vfsj\" (UniqueName: \"kubernetes.io/projected/1bb0b885-60b4-4211-8c18-8d759ff37ace-kube-api-access-9vfsj\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769424 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-auth-proxy-config\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769451 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-serving-cert\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 
20:07:51.769453 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ttwrh"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769483 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae84a622-64a2-408d-9275-b3ece41df758-service-ca-bundle\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769506 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769529 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c17ad003-bca3-46f4-8227-1d59764cf084-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769548 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769568 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-client-ca\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769590 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmr8c\" (UniqueName: \"kubernetes.io/projected/5e87733e-0b51-49dc-b43c-74479aa30aa2-kube-api-access-zmr8c\") pod \"dns-operator-744455d44c-2xv74\" (UID: \"5e87733e-0b51-49dc-b43c-74479aa30aa2\") " pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769609 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-audit-policies\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769628 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28d29c0-1b6f-41bc-b103-016c2ee40e42-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769645 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d29c0-1b6f-41bc-b103-016c2ee40e42-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769695 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit-dir\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769716 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769745 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lks7v\" (UniqueName: \"kubernetes.io/projected/eb2fa492-7952-4155-9f40-bb6f9c569a4f-kube-api-access-lks7v\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769808 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-oauth-config\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769826 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6627dc01-4b17-4951-ac65-b51d037f5215-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769845 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw46t\" (UniqueName: \"kubernetes.io/projected/c17ad003-bca3-46f4-8227-1d59764cf084-kube-api-access-kw46t\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769865 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18d60a3e-2409-4136-b0f2-f46a3860cc9f-signing-cabundle\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769889 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-serving-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769907 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nthhp\" (UID: 
\"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769926 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6627dc01-4b17-4951-ac65-b51d037f5215-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769945 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsltk\" (UniqueName: \"kubernetes.io/projected/6627dc01-4b17-4951-ac65-b51d037f5215-kube-api-access-lsltk\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769966 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtxr\" (UniqueName: \"kubernetes.io/projected/fcc1241a-6888-4bcc-b784-474bc8865f63-kube-api-access-gjtxr\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.769991 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dd97bd4-11f1-48a2-ba74-2eba33de161b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-drzzt\" (UID: \"8dd97bd4-11f1-48a2-ba74-2eba33de161b\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770017 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770048 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcw79\" (UniqueName: \"kubernetes.io/projected/274f0b01-a045-405f-80e8-f278a87a97ce-kube-api-access-pcw79\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770084 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b71b47b-d667-49c1-ae5b-3326bcde5508-images\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770115 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770148 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vlqrg\" (UniqueName: \"kubernetes.io/projected/b363a50c-90bb-42cf-8129-2b5672a86812-kube-api-access-vlqrg\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770173 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05f6d19-514d-4d86-8c9b-31606b794551-serving-cert\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770197 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770217 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc1241a-6888-4bcc-b784-474bc8865f63-serving-cert\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770243 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770263 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770284 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb0b885-60b4-4211-8c18-8d759ff37ace-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770303 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6h4t\" (UniqueName: \"kubernetes.io/projected/c676b337-d700-423f-9fdc-d82946d7e0c4-kube-api-access-d6h4t\") pod \"cluster-samples-operator-665b6dd947-5dnp8\" (UID: \"c676b337-d700-423f-9fdc-d82946d7e0c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770327 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-node-pullsecrets\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770347 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: 
\"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770371 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18d60a3e-2409-4136-b0f2-f46a3860cc9f-signing-key\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770418 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/31337dcf-6a02-4e29-a768-15c7077c99d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770441 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770460 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61ce38b-72a4-41b0-9416-7956d3cc1133-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770479 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770500 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-console-config\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770525 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-config\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770545 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-machine-approver-tls\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770562 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e87733e-0b51-49dc-b43c-74479aa30aa2-metrics-tls\") pod \"dns-operator-744455d44c-2xv74\" (UID: \"5e87733e-0b51-49dc-b43c-74479aa30aa2\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770582 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhd46\" (UniqueName: \"kubernetes.io/projected/d00dca6c-4b7a-4e1a-978b-44e97cba491a-kube-api-access-nhd46\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770614 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770618 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-audit-policies\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770631 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-client-ca\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770767 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770816 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-serving-cert\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.770852 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb2fa492-7952-4155-9f40-bb6f9c569a4f-serving-cert\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771077 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5db\" (UniqueName: \"kubernetes.io/projected/8dd97bd4-11f1-48a2-ba74-2eba33de161b-kube-api-access-wn5db\") pod \"control-plane-machine-set-operator-78cbb6b69f-drzzt\" (UID: \"8dd97bd4-11f1-48a2-ba74-2eba33de161b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771131 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771177 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771227 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x996k\" (UniqueName: \"kubernetes.io/projected/b05f6d19-514d-4d86-8c9b-31606b794551-kube-api-access-x996k\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771266 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-encryption-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771312 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-trusted-ca-bundle\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771366 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-config\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771435 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17ad003-bca3-46f4-8227-1d59764cf084-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771479 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771519 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28d29c0-1b6f-41bc-b103-016c2ee40e42-config\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771668 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.771704 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hqtdl"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 
20:07:51.771839 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.773332 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mtvrz"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.774957 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.777087 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b363a50c-90bb-42cf-8129-2b5672a86812-audit-dir\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.777613 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit-dir\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.777748 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-config\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.778136 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-node-pullsecrets\") pod \"apiserver-76f77b778f-nthhp\" (UID: 
\"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.778163 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b71b47b-d667-49c1-ae5b-3326bcde5508-images\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.778170 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.778826 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b363a50c-90bb-42cf-8129-2b5672a86812-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.778800 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb0b885-60b4-4211-8c18-8d759ff37ace-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.778889 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-stats-auth\") pod 
\"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.780491 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05f6d19-514d-4d86-8c9b-31606b794551-serving-cert\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.780829 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-encryption-config\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781158 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wnspm"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781213 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wnm\" (UniqueName: \"kubernetes.io/projected/ae84a622-64a2-408d-9275-b3ece41df758-kube-api-access-44wnm\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781265 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781354 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b71b47b-d667-49c1-ae5b-3326bcde5508-config\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781420 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-oauth-serving-cert\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781434 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-serving-cert\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781475 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b363a50c-90bb-42cf-8129-2b5672a86812-etcd-client\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.781844 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.782014 4799 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b71b47b-d667-49c1-ae5b-3326bcde5508-config\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.782082 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lq2lw"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.782822 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f6d19-514d-4d86-8c9b-31606b794551-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.782978 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-49fwh"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.784242 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wjwdj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.785112 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2xv74"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.786334 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.787221 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.788406 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.789628 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.790429 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.791559 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.792401 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.793339 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.794326 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbgk7"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.795482 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.797162 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.799078 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.799207 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.800132 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.801559 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gksq8"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.802461 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ttwrh"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.803642 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9rsf6"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.804285 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.804562 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lcd4j"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.805095 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.805576 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9rsf6"] Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.819627 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.840274 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.859775 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.880163 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882318 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6627dc01-4b17-4951-ac65-b51d037f5215-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882364 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsltk\" (UniqueName: \"kubernetes.io/projected/6627dc01-4b17-4951-ac65-b51d037f5215-kube-api-access-lsltk\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882423 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gjtxr\" (UniqueName: \"kubernetes.io/projected/fcc1241a-6888-4bcc-b784-474bc8865f63-kube-api-access-gjtxr\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882456 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dd97bd4-11f1-48a2-ba74-2eba33de161b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-drzzt\" (UID: \"8dd97bd4-11f1-48a2-ba74-2eba33de161b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882489 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882522 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcw79\" (UniqueName: \"kubernetes.io/projected/274f0b01-a045-405f-80e8-f278a87a97ce-kube-api-access-pcw79\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882561 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882618 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc1241a-6888-4bcc-b784-474bc8865f63-serving-cert\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882655 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a06124f0-b975-4b73-b58d-f678af8cda26-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882694 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6h4t\" (UniqueName: \"kubernetes.io/projected/c676b337-d700-423f-9fdc-d82946d7e0c4-kube-api-access-d6h4t\") pod \"cluster-samples-operator-665b6dd947-5dnp8\" (UID: \"c676b337-d700-423f-9fdc-d82946d7e0c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882749 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb0b885-60b4-4211-8c18-8d759ff37ace-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882787 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/31337dcf-6a02-4e29-a768-15c7077c99d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882821 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882854 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18d60a3e-2409-4136-b0f2-f46a3860cc9f-signing-key\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882895 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61ce38b-72a4-41b0-9416-7956d3cc1133-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882938 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.882979 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhd46\" (UniqueName: \"kubernetes.io/projected/d00dca6c-4b7a-4e1a-978b-44e97cba491a-kube-api-access-nhd46\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883040 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-console-config\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883071 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-config\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883106 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-machine-approver-tls\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883142 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e87733e-0b51-49dc-b43c-74479aa30aa2-metrics-tls\") pod 
\"dns-operator-744455d44c-2xv74\" (UID: \"5e87733e-0b51-49dc-b43c-74479aa30aa2\") " pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883177 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-client-ca\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883209 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883254 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb2fa492-7952-4155-9f40-bb6f9c569a4f-serving-cert\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883322 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5db\" (UniqueName: \"kubernetes.io/projected/8dd97bd4-11f1-48a2-ba74-2eba33de161b-kube-api-access-wn5db\") pod \"control-plane-machine-set-operator-78cbb6b69f-drzzt\" (UID: \"8dd97bd4-11f1-48a2-ba74-2eba33de161b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883354 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883387 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883453 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-trusted-ca-bundle\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883489 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-config\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883524 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17ad003-bca3-46f4-8227-1d59764cf084-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883560 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28d29c0-1b6f-41bc-b103-016c2ee40e42-config\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883577 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883596 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883697 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb0b885-60b4-4211-8c18-8d759ff37ace-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883738 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-stats-auth\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883778 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wnm\" (UniqueName: \"kubernetes.io/projected/ae84a622-64a2-408d-9275-b3ece41df758-kube-api-access-44wnm\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883811 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-oauth-serving-cert\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883858 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-config\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883890 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883928 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-default-certificate\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.883968 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/31337dcf-6a02-4e29-a768-15c7077c99d7-srv-cert\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884001 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884036 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884058 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61ce38b-72a4-41b0-9416-7956d3cc1133-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884069 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884134 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9tcx\" (UniqueName: \"kubernetes.io/projected/18d60a3e-2409-4136-b0f2-f46a3860cc9f-kube-api-access-p9tcx\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884173 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884210 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2bxb\" (UniqueName: \"kubernetes.io/projected/343a3122-a4be-4c67-bef4-22cd0e482cea-kube-api-access-x2bxb\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884240 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d00dca6c-4b7a-4e1a-978b-44e97cba491a-trusted-ca\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884274 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75926aed-864d-42ee-aabf-89e5579606a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884322 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5j4\" (UniqueName: \"kubernetes.io/projected/f61ce38b-72a4-41b0-9416-7956d3cc1133-kube-api-access-6r5j4\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884440 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-metrics-certs\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884649 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c676b337-d700-423f-9fdc-d82946d7e0c4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5dnp8\" (UID: \"c676b337-d700-423f-9fdc-d82946d7e0c4\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884700 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f61ce38b-72a4-41b0-9416-7956d3cc1133-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884725 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-console-config\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884739 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/eb2fa492-7952-4155-9f40-bb6f9c569a4f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884773 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-service-ca\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884805 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mz8h\" (UniqueName: 
\"kubernetes.io/projected/31337dcf-6a02-4e29-a768-15c7077c99d7-kube-api-access-2mz8h\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884836 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5fd\" (UniqueName: \"kubernetes.io/projected/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-kube-api-access-mm5fd\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884892 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxm44\" (UniqueName: \"kubernetes.io/projected/e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d-kube-api-access-gxm44\") pod \"downloads-7954f5f757-49fwh\" (UID: \"e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d\") " pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884932 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.884972 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00dca6c-4b7a-4e1a-978b-44e97cba491a-serving-cert\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 
20:07:51.885007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885046 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885113 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6627dc01-4b17-4951-ac65-b51d037f5215-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885168 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a06124f0-b975-4b73-b58d-f678af8cda26-ready\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885205 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-serving-cert\") pod 
\"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885264 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgj9w\" (UniqueName: \"kubernetes.io/projected/e7a739cc-cb77-45dc-9811-661046ccf05b-kube-api-access-tgj9w\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885300 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lds6j\" (UniqueName: \"kubernetes.io/projected/75926aed-864d-42ee-aabf-89e5579606a7-kube-api-access-lds6j\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885565 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00dca6c-4b7a-4e1a-978b-44e97cba491a-config\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885648 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-config\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885651 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-trusted-ca-bundle\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885672 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274f0b01-a045-405f-80e8-f278a87a97ce-audit-dir\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885771 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vfsj\" (UniqueName: \"kubernetes.io/projected/1bb0b885-60b4-4211-8c18-8d759ff37ace-kube-api-access-9vfsj\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885869 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-auth-proxy-config\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885923 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae84a622-64a2-408d-9275-b3ece41df758-service-ca-bundle\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.885957 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.886014 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwbg7\" (UniqueName: \"kubernetes.io/projected/a06124f0-b975-4b73-b58d-f678af8cda26-kube-api-access-hwbg7\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.886175 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bb0b885-60b4-4211-8c18-8d759ff37ace-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.886340 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d00dca6c-4b7a-4e1a-978b-44e97cba491a-trusted-ca\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.886416 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.886941 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17ad003-bca3-46f4-8227-1d59764cf084-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.887359 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c17ad003-bca3-46f4-8227-1d59764cf084-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.887799 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-oauth-serving-cert\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.887848 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d29c0-1b6f-41bc-b103-016c2ee40e42-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.888173 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-client-ca\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.888308 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-auth-proxy-config\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.887896 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-client-ca\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.888425 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmr8c\" (UniqueName: \"kubernetes.io/projected/5e87733e-0b51-49dc-b43c-74479aa30aa2-kube-api-access-zmr8c\") pod \"dns-operator-744455d44c-2xv74\" (UID: \"5e87733e-0b51-49dc-b43c-74479aa30aa2\") " pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.888466 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-audit-policies\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.888473 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274f0b01-a045-405f-80e8-f278a87a97ce-audit-dir\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.888501 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28d29c0-1b6f-41bc-b103-016c2ee40e42-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.889226 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28d29c0-1b6f-41bc-b103-016c2ee40e42-config\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.889499 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-machine-approver-tls\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.889847 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-client-ca\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.890062 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-config\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.890427 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-service-ca\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.890521 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.890752 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.890766 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-config\") pod 
\"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891013 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lks7v\" (UniqueName: \"kubernetes.io/projected/eb2fa492-7952-4155-9f40-bb6f9c569a4f-kube-api-access-lks7v\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891059 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c17ad003-bca3-46f4-8227-1d59764cf084-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891268 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-oauth-config\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891353 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6627dc01-4b17-4951-ac65-b51d037f5215-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891482 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kw46t\" (UniqueName: \"kubernetes.io/projected/c17ad003-bca3-46f4-8227-1d59764cf084-kube-api-access-kw46t\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891628 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18d60a3e-2409-4136-b0f2-f46a3860cc9f-signing-cabundle\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891825 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d00dca6c-4b7a-4e1a-978b-44e97cba491a-serving-cert\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892091 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb2fa492-7952-4155-9f40-bb6f9c569a4f-serving-cert\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892388 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892450 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bb0b885-60b4-4211-8c18-8d759ff37ace-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892549 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dd97bd4-11f1-48a2-ba74-2eba33de161b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-drzzt\" (UID: \"8dd97bd4-11f1-48a2-ba74-2eba33de161b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892667 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00dca6c-4b7a-4e1a-978b-44e97cba491a-config\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892714 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.892710 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/c676b337-d700-423f-9fdc-d82946d7e0c4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5dnp8\" (UID: \"c676b337-d700-423f-9fdc-d82946d7e0c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.891778 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.893153 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-default-certificate\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.893228 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.893705 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75926aed-864d-42ee-aabf-89e5579606a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:51 crc 
kubenswrapper[4799]: I0319 20:07:51.893862 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc1241a-6888-4bcc-b784-474bc8865f63-serving-cert\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.894144 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-audit-policies\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.894451 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f61ce38b-72a4-41b0-9416-7956d3cc1133-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.894650 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/eb2fa492-7952-4155-9f40-bb6f9c569a4f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.895224 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.895758 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-oauth-config\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.895810 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.896012 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.896487 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-serving-cert\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.897769 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.897840 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28d29c0-1b6f-41bc-b103-016c2ee40e42-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.898773 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.899672 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.930956 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.934331 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-stats-auth\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.939183 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae84a622-64a2-408d-9275-b3ece41df758-service-ca-bundle\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.939568 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.942030 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae84a622-64a2-408d-9275-b3ece41df758-metrics-certs\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.959797 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.979831 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.994068 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a06124f0-b975-4b73-b58d-f678af8cda26-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.994185 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a06124f0-b975-4b73-b58d-f678af8cda26-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 
crc kubenswrapper[4799]: I0319 20:07:51.994481 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.994547 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a06124f0-b975-4b73-b58d-f678af8cda26-ready\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.994673 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwbg7\" (UniqueName: \"kubernetes.io/projected/a06124f0-b975-4b73-b58d-f678af8cda26-kube-api-access-hwbg7\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:51 crc kubenswrapper[4799]: I0319 20:07:51.995908 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a06124f0-b975-4b73-b58d-f678af8cda26-ready\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.009657 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.017344 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.019603 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.040094 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.044772 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.059814 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.079551 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.091006 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/31337dcf-6a02-4e29-a768-15c7077c99d7-srv-cert\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.101100 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.119171 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.129981 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/31337dcf-6a02-4e29-a768-15c7077c99d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.140549 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.159781 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.187795 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.194695 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6627dc01-4b17-4951-ac65-b51d037f5215-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.199569 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.219366 4799 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.227856 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6627dc01-4b17-4951-ac65-b51d037f5215-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.241206 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.260887 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.266503 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.281435 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.299905 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.310827 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e87733e-0b51-49dc-b43c-74479aa30aa2-metrics-tls\") pod 
\"dns-operator-744455d44c-2xv74\" (UID: \"5e87733e-0b51-49dc-b43c-74479aa30aa2\") " pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.319640 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.340256 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.360895 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.380810 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.390865 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.400795 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.420745 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.440100 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.460763 4799 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.471769 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18d60a3e-2409-4136-b0f2-f46a3860cc9f-signing-key\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.479711 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.485028 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18d60a3e-2409-4136-b0f2-f46a3860cc9f-signing-cabundle\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.520671 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.539851 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.560524 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.590688 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.600863 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 
20:07:52.619870 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.639914 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.659839 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.679706 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.699298 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.718524 4799 request.go:700] Waited for 1.00462054s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.720056 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.740094 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.760358 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 20:07:52 crc kubenswrapper[4799]: 
E0319 20:07:52.771804 4799 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.771923 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls podName:0b71b47b-d667-49c1-ae5b-3326bcde5508 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.27189107 +0000 UTC m=+150.877844172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-6ddkj" (UID: "0b71b47b-d667-49c1-ae5b-3326bcde5508") : failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.778711 4799 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.778797 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.278775069 +0000 UTC m=+150.884728191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780137 4799 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780164 4799 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780213 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-trusted-ca-bundle podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.280193512 +0000 UTC m=+150.886146624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-trusted-ca-bundle") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780258 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-encryption-config podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.280244923 +0000 UTC m=+150.886198035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-encryption-config") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780137 4799 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780263 4799 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780303 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-config podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.280291994 +0000 UTC m=+150.886245106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-config") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780325 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-serving-ca podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.280314775 +0000 UTC m=+150.886267877 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-serving-ca") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780843 4799 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780849 4799 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780883 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-serving-cert podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.280871848 +0000 UTC m=+150.886824950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-serving-cert") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780938 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.280913469 +0000 UTC m=+150.886866581 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780944 4799 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.780986 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.28097093 +0000 UTC m=+150.886924042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.786810 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.800953 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.820812 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.840498 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.860619 4799 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.880128 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.900168 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.920730 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.940792 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.960274 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 20:07:52 crc kubenswrapper[4799]: I0319 20:07:52.980791 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.995296 4799 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:52 crc kubenswrapper[4799]: E0319 20:07:52.995421 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist podName:a06124f0-b975-4b73-b58d-f678af8cda26 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:53.495373556 +0000 UTC m=+151.101326658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-dzmpr" (UID: "a06124f0-b975-4b73-b58d-f678af8cda26") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.000230 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.020799 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.039657 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.061025 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.080051 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.099979 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.116337 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:07:53 crc kubenswrapper[4799]: E0319 20:07:53.116818 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.120569 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.139943 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.160641 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.179769 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.200190 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.221193 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.240984 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.260747 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.280772 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.300763 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320127 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320330 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320436 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320479 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320533 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320666 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-encryption-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.320762 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.321042 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-serving-cert\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.321276 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.321351 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-serving-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.339703 4799 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.360332 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.380186 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.399790 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.420196 4799 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.440052 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.484866 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlqrg\" (UniqueName: \"kubernetes.io/projected/b363a50c-90bb-42cf-8129-2b5672a86812-kube-api-access-vlqrg\") pod \"apiserver-7bbb656c7d-zgsz2\" (UID: \"b363a50c-90bb-42cf-8129-2b5672a86812\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.504584 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.524208 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.525338 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.546781 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x996k\" (UniqueName: \"kubernetes.io/projected/b05f6d19-514d-4d86-8c9b-31606b794551-kube-api-access-x996k\") pod \"authentication-operator-69f744f599-942vf\" (UID: \"b05f6d19-514d-4d86-8c9b-31606b794551\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.559783 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.580182 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.600099 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.620476 4799 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.643017 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.660336 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.681577 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.718109 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtxr\" (UniqueName: \"kubernetes.io/projected/fcc1241a-6888-4bcc-b784-474bc8865f63-kube-api-access-gjtxr\") pod \"controller-manager-879f6c89f-zxq6v\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.738763 4799 request.go:700] Waited for 1.856039471s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.740454 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsltk\" (UniqueName: \"kubernetes.io/projected/6627dc01-4b17-4951-ac65-b51d037f5215-kube-api-access-lsltk\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.744739 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2"] Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.757567 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcw79\" (UniqueName: \"kubernetes.io/projected/274f0b01-a045-405f-80e8-f278a87a97ce-kube-api-access-pcw79\") pod \"oauth-openshift-558db77b4-j5t7c\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.778429 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6h4t\" (UniqueName: \"kubernetes.io/projected/c676b337-d700-423f-9fdc-d82946d7e0c4-kube-api-access-d6h4t\") pod \"cluster-samples-operator-665b6dd947-5dnp8\" (UID: \"c676b337-d700-423f-9fdc-d82946d7e0c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.797216 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhd46\" (UniqueName: \"kubernetes.io/projected/d00dca6c-4b7a-4e1a-978b-44e97cba491a-kube-api-access-nhd46\") pod \"console-operator-58897d9998-wjwdj\" (UID: \"d00dca6c-4b7a-4e1a-978b-44e97cba491a\") " pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.798677 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.813058 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.818547 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mz8h\" (UniqueName: \"kubernetes.io/projected/31337dcf-6a02-4e29-a768-15c7077c99d7-kube-api-access-2mz8h\") pod \"catalog-operator-68c6474976-fhbnk\" (UID: \"31337dcf-6a02-4e29-a768-15c7077c99d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.820431 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.844811 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" event={"ID":"b363a50c-90bb-42cf-8129-2b5672a86812","Type":"ContainerStarted","Data":"348b1826e45252808ec046c83652248cbb3852763cc538bca34e2919c87cb9fb"} Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.844964 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2bxb\" (UniqueName: \"kubernetes.io/projected/343a3122-a4be-4c67-bef4-22cd0e482cea-kube-api-access-x2bxb\") pod \"console-f9d7485db-lq2lw\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.844972 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.876946 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd5f52c6-275b-43e3-bc17-9fd85638d9bb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxzt7\" (UID: \"fd5f52c6-275b-43e3-bc17-9fd85638d9bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.904545 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.918657 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.928137 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9tcx\" (UniqueName: \"kubernetes.io/projected/18d60a3e-2409-4136-b0f2-f46a3860cc9f-kube-api-access-p9tcx\") pod \"service-ca-9c57cc56f-hqtdl\" (UID: \"18d60a3e-2409-4136-b0f2-f46a3860cc9f\") " pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.941669 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wnm\" (UniqueName: \"kubernetes.io/projected/ae84a622-64a2-408d-9275-b3ece41df758-kube-api-access-44wnm\") pod \"router-default-5444994796-zxk4t\" (UID: \"ae84a622-64a2-408d-9275-b3ece41df758\") " pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.945840 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vfsj\" (UniqueName: 
\"kubernetes.io/projected/1bb0b885-60b4-4211-8c18-8d759ff37ace-kube-api-access-9vfsj\") pod \"openshift-apiserver-operator-796bbdcf4f-dcbvh\" (UID: \"1bb0b885-60b4-4211-8c18-8d759ff37ace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.959618 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6627dc01-4b17-4951-ac65-b51d037f5215-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pqjmc\" (UID: \"6627dc01-4b17-4951-ac65-b51d037f5215\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.962118 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.974277 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d29c0-1b6f-41bc-b103-016c2ee40e42-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z5fgk\" (UID: \"b28d29c0-1b6f-41bc-b103-016c2ee40e42\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.977617 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:53 crc kubenswrapper[4799]: W0319 20:07:53.992758 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae84a622_64a2_408d_9275_b3ece41df758.slice/crio-8206d95d367073a501573b45dbbc425304a8e88aa27663236103996db95ca75c WatchSource:0}: Error finding container 8206d95d367073a501573b45dbbc425304a8e88aa27663236103996db95ca75c: Status 404 returned error can't find the container with id 8206d95d367073a501573b45dbbc425304a8e88aa27663236103996db95ca75c Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.994251 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" Mar 19 20:07:53 crc kubenswrapper[4799]: I0319 20:07:53.995533 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lds6j\" (UniqueName: \"kubernetes.io/projected/75926aed-864d-42ee-aabf-89e5579606a7-kube-api-access-lds6j\") pod \"route-controller-manager-6576b87f9c-nzcwj\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.021029 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5j4\" (UniqueName: \"kubernetes.io/projected/f61ce38b-72a4-41b0-9416-7956d3cc1133-kube-api-access-6r5j4\") pod \"kube-storage-version-migrator-operator-b67b599dd-qb26c\" (UID: \"f61ce38b-72a4-41b0-9416-7956d3cc1133\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.033248 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmr8c\" (UniqueName: 
\"kubernetes.io/projected/5e87733e-0b51-49dc-b43c-74479aa30aa2-kube-api-access-zmr8c\") pod \"dns-operator-744455d44c-2xv74\" (UID: \"5e87733e-0b51-49dc-b43c-74479aa30aa2\") " pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.050195 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.050838 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.053994 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxm44\" (UniqueName: \"kubernetes.io/projected/e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d-kube-api-access-gxm44\") pod \"downloads-7954f5f757-49fwh\" (UID: \"e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d\") " pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.059906 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.080942 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5fd\" (UniqueName: \"kubernetes.io/projected/51120354-ef9e-442c-b1f0-2bcaf2e4d5ee-kube-api-access-mm5fd\") pod \"machine-approver-56656f9798-zblhc\" (UID: \"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.097069 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j5t7c"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.097108 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.098159 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgj9w\" (UniqueName: \"kubernetes.io/projected/e7a739cc-cb77-45dc-9811-661046ccf05b-kube-api-access-tgj9w\") pod \"marketplace-operator-79b997595-wjn4f\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.113062 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lks7v\" (UniqueName: \"kubernetes.io/projected/eb2fa492-7952-4155-9f40-bb6f9c569a4f-kube-api-access-lks7v\") pod \"openshift-config-operator-7777fb866f-wnspm\" (UID: \"eb2fa492-7952-4155-9f40-bb6f9c569a4f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.130711 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.140035 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.156724 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zxq6v"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.157653 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw46t\" (UniqueName: \"kubernetes.io/projected/c17ad003-bca3-46f4-8227-1d59764cf084-kube-api-access-kw46t\") pod \"openshift-controller-manager-operator-756b6f6bc6-p7f89\" (UID: \"c17ad003-bca3-46f4-8227-1d59764cf084\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.166217 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwbg7\" (UniqueName: \"kubernetes.io/projected/a06124f0-b975-4b73-b58d-f678af8cda26-kube-api-access-hwbg7\") pod \"cni-sysctl-allowlist-ds-dzmpr\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.166461 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.168303 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.172207 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.181058 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.182279 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-audit\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.211363 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lq2lw"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.211683 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.215273 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=10.215264702 podStartE2EDuration="10.215264702s" podCreationTimestamp="2026-03-19 20:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:54.213984292 +0000 UTC m=+151.819937364" watchObservedRunningTime="2026-03-19 20:07:54.215264702 +0000 UTC m=+151.821217774" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.221254 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.226144 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wjwdj"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.227143 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.231033 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: W0319 20:07:54.231556 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51120354_ef9e_442c_b1f0_2bcaf2e4d5ee.slice/crio-8cb06112566cd25590ba0e4589ee39a4cfee5c804503f547ded2dbe500f75048 WatchSource:0}: Error finding container 8cb06112566cd25590ba0e4589ee39a4cfee5c804503f547ded2dbe500f75048: Status 404 returned error can't find the container with id 8cb06112566cd25590ba0e4589ee39a4cfee5c804503f547ded2dbe500f75048 Mar 19 20:07:54 crc kubenswrapper[4799]: W0319 20:07:54.232812 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcc1241a_6888_4bcc_b784_474bc8865f63.slice/crio-6c983e470723989c2083ce3148a56465520f16fba7a8d7c309a4f4e03a952d3c WatchSource:0}: Error finding container 6c983e470723989c2083ce3148a56465520f16fba7a8d7c309a4f4e03a952d3c: Status 404 returned error can't find the container with id 6c983e470723989c2083ce3148a56465520f16fba7a8d7c309a4f4e03a952d3c Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.234872 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.244199 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.244500 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.247466 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-942vf"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.255296 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.260416 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.264614 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-serving-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.270104 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.277968 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.279682 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.290689 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hslzg\" (UniqueName: \"kubernetes.io/projected/0b71b47b-d667-49c1-ae5b-3326bcde5508-kube-api-access-hslzg\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.292831 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5db\" (UniqueName: \"kubernetes.io/projected/8dd97bd4-11f1-48a2-ba74-2eba33de161b-kube-api-access-wn5db\") pod \"control-plane-machine-set-operator-78cbb6b69f-drzzt\" (UID: \"8dd97bd4-11f1-48a2-ba74-2eba33de161b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.302453 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.319556 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.320286 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-serving-cert\") pod \"apiserver-76f77b778f-nthhp\" (UID: 
\"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.321359 4799 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.321444 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.321423091 +0000 UTC m=+152.927376163 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.321567 4799 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.321632 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca podName:5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.321613925 +0000 UTC m=+152.927566997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca") pod "apiserver-76f77b778f-nthhp" (UID: "5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7") : failed to sync configmap cache: timed out waiting for the condition Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.321792 4799 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.321859 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls podName:0b71b47b-d667-49c1-ae5b-3326bcde5508 nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.321845761 +0000 UTC m=+152.927798943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-6ddkj" (UID: "0b71b47b-d667-49c1-ae5b-3326bcde5508") : failed to sync secret cache: timed out waiting for the condition Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.324238 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-encryption-config\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.324472 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.340988 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.365855 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.379180 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.394466 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.400410 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.421464 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.423216 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.440738 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.461547 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.464629 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.471929 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbq8\" (UniqueName: \"kubernetes.io/projected/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-kube-api-access-nhbq8\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.474110 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.496272 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wnspm"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.531833 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=4.531814934 podStartE2EDuration="4.531814934s" podCreationTimestamp="2026-03-19 20:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:54.531308912 +0000 UTC m=+152.137261984" watchObservedRunningTime="2026-03-19 
20:07:54.531814934 +0000 UTC m=+152.137768006" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.547235 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" Mar 19 20:07:54 crc kubenswrapper[4799]: W0319 20:07:54.547942 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb0b885_60b4_4211_8c18_8d759ff37ace.slice/crio-6ef112808cb5fd937db079a9ba1d9a2cb5351053f33ba13ab39cefb258198863 WatchSource:0}: Error finding container 6ef112808cb5fd937db079a9ba1d9a2cb5351053f33ba13ab39cefb258198863: Status 404 returned error can't find the container with id 6ef112808cb5fd937db079a9ba1d9a2cb5351053f33ba13ab39cefb258198863 Mar 19 20:07:54 crc kubenswrapper[4799]: W0319 20:07:54.549249 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75926aed_864d_42ee_aabf_89e5579606a7.slice/crio-b0d9dcebf6dec16613056a2a655a4cc33df7f6438b9deea6689980bbb526bd9a WatchSource:0}: Error finding container b0d9dcebf6dec16613056a2a655a4cc33df7f6438b9deea6689980bbb526bd9a: Status 404 returned error can't find the container with id b0d9dcebf6dec16613056a2a655a4cc33df7f6438b9deea6689980bbb526bd9a Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550322 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-service-ca\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550363 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dh2\" (UniqueName: 
\"kubernetes.io/projected/fd7aca22-2318-490a-bf8c-33e72bc84a8c-kube-api-access-b4dh2\") pod \"migrator-59844c95c7-5lgmn\" (UID: \"fd7aca22-2318-490a-bf8c-33e72bc84a8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550406 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550423 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttgz6\" (UniqueName: \"kubernetes.io/projected/af651426-59b2-46f8-9e60-011b0d134c7d-kube-api-access-ttgz6\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550471 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lcf\" (UniqueName: \"kubernetes.io/projected/f8720bb6-e6ea-43b3-a750-d0b7c1221266-kube-api-access-c5lcf\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550535 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c555a523-2329-4b06-be49-cb3ffa1e5212-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550576 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8bl8\" (UniqueName: \"kubernetes.io/projected/be6d29c8-3af3-4395-accc-c87b7685791e-kube-api-access-w8bl8\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550607 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9nf2\" (UniqueName: \"kubernetes.io/projected/8ee614fe-98c9-4ae4-9233-dfe531cc54f9-kube-api-access-h9nf2\") pod \"package-server-manager-789f6589d5-f7jjj\" (UID: \"8ee614fe-98c9-4ae4-9233-dfe531cc54f9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550670 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6854j\" (UniqueName: \"kubernetes.io/projected/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-kube-api-access-6854j\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550721 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c555a523-2329-4b06-be49-cb3ffa1e5212-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: 
I0319 20:07:54.550746 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a53dfbf1-613e-4465-ae93-b9a146216ec4-serving-cert\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550808 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-trusted-ca\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550825 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550850 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqrqq\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-kube-api-access-pqrqq\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550898 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-mountpoint-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: 
\"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550965 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trxlp\" (UniqueName: \"kubernetes.io/projected/0af95971-f650-4f4a-994e-72e58a5ba378-kube-api-access-trxlp\") pod \"multus-admission-controller-857f4d67dd-mtvrz\" (UID: \"0af95971-f650-4f4a-994e-72e58a5ba378\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550980 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-socket-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.550994 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/be6d29c8-3af3-4395-accc-c87b7685791e-srv-cert\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551011 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0af95971-f650-4f4a-994e-72e58a5ba378-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mtvrz\" (UID: \"0af95971-f650-4f4a-994e-72e58a5ba378\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551026 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d86bd00-091f-4a51-ac44-af09c78f1857-trusted-ca\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551040 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-tmpfs\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551067 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-config\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551083 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af651426-59b2-46f8-9e60-011b0d134c7d-images\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551116 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d86bd00-091f-4a51-ac44-af09c78f1857-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc 
kubenswrapper[4799]: I0319 20:07:54.551160 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-bound-sa-token\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551249 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-client\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551315 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-tls\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551333 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcnp\" (UniqueName: \"kubernetes.io/projected/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-kube-api-access-cvcnp\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551348 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af651426-59b2-46f8-9e60-011b0d134c7d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: 
\"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.551409 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0d7527-3b92-4479-80b9-4a71e7c1054b-config\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.554360 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af651426-59b2-46f8-9e60-011b0d134c7d-proxy-tls\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.554403 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-ca\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.554517 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb976\" (UniqueName: \"kubernetes.io/projected/a53dfbf1-613e-4465-ae93-b9a146216ec4-kube-api-access-nb976\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.554536 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8720bb6-e6ea-43b3-a750-d0b7c1221266-config-volume\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.556370 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9nlf\" (UniqueName: \"kubernetes.io/projected/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-kube-api-access-b9nlf\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.556701 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-plugins-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.557522 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hqtdl"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.557661 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d86bd00-091f-4a51-ac44-af09c78f1857-metrics-tls\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558007 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558143 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-certificates\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558166 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8720bb6-e6ea-43b3-a750-d0b7c1221266-secret-volume\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558302 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/be6d29c8-3af3-4395-accc-c87b7685791e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558332 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-csi-data-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " 
pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558350 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558586 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-registration-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558614 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-webhook-cert\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558885 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-metrics-tls\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.558964 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9xz\" (UniqueName: 
\"kubernetes.io/projected/0d0d7527-3b92-4479-80b9-4a71e7c1054b-kube-api-access-ln9xz\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.559028 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-config-volume\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.559084 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d0d7527-3b92-4479-80b9-4a71e7c1054b-serving-cert\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.559247 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.559270 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rntt\" (UniqueName: \"kubernetes.io/projected/38afdf16-38e4-47b5-b3d7-aa040962429d-kube-api-access-8rntt\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: 
E0319 20:07:54.559610 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.059596608 +0000 UTC m=+152.665549680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.559882 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ee614fe-98c9-4ae4-9233-dfe531cc54f9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f7jjj\" (UID: \"8ee614fe-98c9-4ae4-9233-dfe531cc54f9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.559975 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2qv\" (UniqueName: \"kubernetes.io/projected/8d86bd00-091f-4a51-ac44-af09c78f1857-kube-api-access-rs2qv\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.560162 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c555a523-2329-4b06-be49-cb3ffa1e5212-config\") pod 
\"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.560288 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-proxy-tls\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.568492 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2xv74"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662047 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.662227 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.162202384 +0000 UTC m=+152.768155466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662497 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb976\" (UniqueName: \"kubernetes.io/projected/a53dfbf1-613e-4465-ae93-b9a146216ec4-kube-api-access-nb976\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662525 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8720bb6-e6ea-43b3-a750-d0b7c1221266-config-volume\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662636 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/46dfa0a6-ed23-4430-84b6-f0652be25261-certs\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662664 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9nlf\" (UniqueName: \"kubernetes.io/projected/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-kube-api-access-b9nlf\") pod 
\"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662686 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/46dfa0a6-ed23-4430-84b6-f0652be25261-node-bootstrap-token\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662798 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-plugins-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662819 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d86bd00-091f-4a51-ac44-af09c78f1857-metrics-tls\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662837 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662942 4799 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8720bb6-e6ea-43b3-a750-d0b7c1221266-secret-volume\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662963 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-certificates\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662978 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-csi-data-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.662994 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663101 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/be6d29c8-3af3-4395-accc-c87b7685791e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 
20:07:54.663120 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-registration-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663136 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-webhook-cert\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663153 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbghb\" (UniqueName: \"kubernetes.io/projected/f0dce6fd-e631-404f-8864-6c7f0fcbd52f-kube-api-access-pbghb\") pod \"ingress-canary-9rsf6\" (UID: \"f0dce6fd-e631-404f-8864-6c7f0fcbd52f\") " pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663281 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-metrics-tls\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663297 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9xz\" (UniqueName: \"kubernetes.io/projected/0d0d7527-3b92-4479-80b9-4a71e7c1054b-kube-api-access-ln9xz\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 
crc kubenswrapper[4799]: I0319 20:07:54.663400 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-config-volume\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663417 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d0d7527-3b92-4479-80b9-4a71e7c1054b-serving-cert\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663437 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663458 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rntt\" (UniqueName: \"kubernetes.io/projected/38afdf16-38e4-47b5-b3d7-aa040962429d-kube-api-access-8rntt\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663578 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2qv\" (UniqueName: \"kubernetes.io/projected/8d86bd00-091f-4a51-ac44-af09c78f1857-kube-api-access-rs2qv\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663594 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ee614fe-98c9-4ae4-9233-dfe531cc54f9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f7jjj\" (UID: \"8ee614fe-98c9-4ae4-9233-dfe531cc54f9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663613 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c555a523-2329-4b06-be49-cb3ffa1e5212-config\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663724 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-proxy-tls\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663749 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-service-ca\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663765 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4s9gq\" (UniqueName: \"kubernetes.io/projected/46dfa0a6-ed23-4430-84b6-f0652be25261-kube-api-access-4s9gq\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663817 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wjn4f"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663877 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dh2\" (UniqueName: \"kubernetes.io/projected/fd7aca22-2318-490a-bf8c-33e72bc84a8c-kube-api-access-b4dh2\") pod \"migrator-59844c95c7-5lgmn\" (UID: \"fd7aca22-2318-490a-bf8c-33e72bc84a8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663897 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663919 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-csi-data-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663929 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttgz6\" (UniqueName: \"kubernetes.io/projected/af651426-59b2-46f8-9e60-011b0d134c7d-kube-api-access-ttgz6\") pod 
\"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664112 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lcf\" (UniqueName: \"kubernetes.io/projected/f8720bb6-e6ea-43b3-a750-d0b7c1221266-kube-api-access-c5lcf\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664135 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c555a523-2329-4b06-be49-cb3ffa1e5212-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664155 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8bl8\" (UniqueName: \"kubernetes.io/projected/be6d29c8-3af3-4395-accc-c87b7685791e-kube-api-access-w8bl8\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664171 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9nf2\" (UniqueName: \"kubernetes.io/projected/8ee614fe-98c9-4ae4-9233-dfe531cc54f9-kube-api-access-h9nf2\") pod \"package-server-manager-789f6589d5-f7jjj\" (UID: \"8ee614fe-98c9-4ae4-9233-dfe531cc54f9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 
20:07:54.664298 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6854j\" (UniqueName: \"kubernetes.io/projected/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-kube-api-access-6854j\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664335 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c555a523-2329-4b06-be49-cb3ffa1e5212-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664372 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-plugins-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664459 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a53dfbf1-613e-4465-ae93-b9a146216ec4-serving-cert\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664483 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0dce6fd-e631-404f-8864-6c7f0fcbd52f-cert\") pod \"ingress-canary-9rsf6\" (UID: \"f0dce6fd-e631-404f-8864-6c7f0fcbd52f\") " 
pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664612 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-trusted-ca\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664635 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664652 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqrqq\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-kube-api-access-pqrqq\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664770 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-mountpoint-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664807 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-socket-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: 
\"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664824 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/be6d29c8-3af3-4395-accc-c87b7685791e-srv-cert\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664937 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trxlp\" (UniqueName: \"kubernetes.io/projected/0af95971-f650-4f4a-994e-72e58a5ba378-kube-api-access-trxlp\") pod \"multus-admission-controller-857f4d67dd-mtvrz\" (UID: \"0af95971-f650-4f4a-994e-72e58a5ba378\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664954 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0af95971-f650-4f4a-994e-72e58a5ba378-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mtvrz\" (UID: \"0af95971-f650-4f4a-994e-72e58a5ba378\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664969 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-tmpfs\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.664984 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-config\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665102 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d86bd00-091f-4a51-ac44-af09c78f1857-trusted-ca\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665121 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af651426-59b2-46f8-9e60-011b0d134c7d-images\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665238 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665145 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d86bd00-091f-4a51-ac44-af09c78f1857-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665279 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-bound-sa-token\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665309 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-client\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665480 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-tls\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665504 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcnp\" (UniqueName: \"kubernetes.io/projected/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-kube-api-access-cvcnp\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665519 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0d7527-3b92-4479-80b9-4a71e7c1054b-config\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665630 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af651426-59b2-46f8-9e60-011b0d134c7d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665661 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af651426-59b2-46f8-9e60-011b0d134c7d-proxy-tls\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665676 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-ca\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.665821 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-socket-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.666432 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-certificates\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 
crc kubenswrapper[4799]: I0319 20:07:54.666602 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.666884 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-registration-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.667659 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-ca\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.667667 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/38afdf16-38e4-47b5-b3d7-aa040962429d-mountpoint-dir\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.670620 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.170601729 +0000 UTC m=+152.776554911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.672538 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d0d7527-3b92-4479-80b9-4a71e7c1054b-serving-cert\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.672881 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0d7527-3b92-4479-80b9-4a71e7c1054b-config\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.672883 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d86bd00-091f-4a51-ac44-af09c78f1857-trusted-ca\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.673837 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-config-volume\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " 
pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.674202 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-trusted-ca\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.675200 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c555a523-2329-4b06-be49-cb3ffa1e5212-config\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.675544 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af651426-59b2-46f8-9e60-011b0d134c7d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.675573 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af651426-59b2-46f8-9e60-011b0d134c7d-images\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.675794 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-config\") pod \"etcd-operator-b45778765-x5czz\" (UID: 
\"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.663761 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8720bb6-e6ea-43b3-a750-d0b7c1221266-config-volume\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.677513 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0af95971-f650-4f4a-994e-72e58a5ba378-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mtvrz\" (UID: \"0af95971-f650-4f4a-994e-72e58a5ba378\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.677627 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c555a523-2329-4b06-be49-cb3ffa1e5212-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.677642 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/be6d29c8-3af3-4395-accc-c87b7685791e-srv-cert\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.677935 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8d86bd00-091f-4a51-ac44-af09c78f1857-metrics-tls\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.678116 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ee614fe-98c9-4ae4-9233-dfe531cc54f9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f7jjj\" (UID: \"8ee614fe-98c9-4ae4-9233-dfe531cc54f9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.678369 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.678436 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8720bb6-e6ea-43b3-a750-d0b7c1221266-secret-volume\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.680358 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/be6d29c8-3af3-4395-accc-c87b7685791e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: 
I0319 20:07:54.680456 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-tmpfs\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.680872 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-service-ca\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.686765 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-proxy-tls\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.689689 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a53dfbf1-613e-4465-ae93-b9a146216ec4-serving-cert\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.691044 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-webhook-cert\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc 
kubenswrapper[4799]: I0319 20:07:54.691099 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-metrics-tls\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.696112 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.697324 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a53dfbf1-613e-4465-ae93-b9a146216ec4-etcd-client\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.697857 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-tls\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.698269 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af651426-59b2-46f8-9e60-011b0d134c7d-proxy-tls\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 
20:07:54.717495 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb976\" (UniqueName: \"kubernetes.io/projected/a53dfbf1-613e-4465-ae93-b9a146216ec4-kube-api-access-nb976\") pod \"etcd-operator-b45778765-x5czz\" (UID: \"a53dfbf1-613e-4465-ae93-b9a146216ec4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:54 crc kubenswrapper[4799]: W0319 20:07:54.718702 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7a739cc_cb77_45dc_9811_661046ccf05b.slice/crio-6ee473103d264e9b3b762a626c12fd6bfc7ca00a408944ae24edffc3e21f832a WatchSource:0}: Error finding container 6ee473103d264e9b3b762a626c12fd6bfc7ca00a408944ae24edffc3e21f832a: Status 404 returned error can't find the container with id 6ee473103d264e9b3b762a626c12fd6bfc7ca00a408944ae24edffc3e21f832a Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.733899 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9nlf\" (UniqueName: \"kubernetes.io/projected/e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84-kube-api-access-b9nlf\") pod \"machine-config-controller-84d6567774-6mk7s\" (UID: \"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.742929 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.751762 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.757248 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttgz6\" (UniqueName: 
\"kubernetes.io/projected/af651426-59b2-46f8-9e60-011b0d134c7d-kube-api-access-ttgz6\") pod \"machine-config-operator-74547568cd-9pr5k\" (UID: \"af651426-59b2-46f8-9e60-011b0d134c7d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.766950 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.767194 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9gq\" (UniqueName: \"kubernetes.io/projected/46dfa0a6-ed23-4430-84b6-f0652be25261-kube-api-access-4s9gq\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.767283 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0dce6fd-e631-404f-8864-6c7f0fcbd52f-cert\") pod \"ingress-canary-9rsf6\" (UID: \"f0dce6fd-e631-404f-8864-6c7f0fcbd52f\") " pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.767383 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/46dfa0a6-ed23-4430-84b6-f0652be25261-certs\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.767417 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/46dfa0a6-ed23-4430-84b6-f0652be25261-node-bootstrap-token\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.767441 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbghb\" (UniqueName: \"kubernetes.io/projected/f0dce6fd-e631-404f-8864-6c7f0fcbd52f-kube-api-access-pbghb\") pod \"ingress-canary-9rsf6\" (UID: \"f0dce6fd-e631-404f-8864-6c7f0fcbd52f\") " pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.767645 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.267630436 +0000 UTC m=+152.873583508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.783428 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0dce6fd-e631-404f-8864-6c7f0fcbd52f-cert\") pod \"ingress-canary-9rsf6\" (UID: \"f0dce6fd-e631-404f-8864-6c7f0fcbd52f\") " pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.783807 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/46dfa0a6-ed23-4430-84b6-f0652be25261-certs\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.789482 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/46dfa0a6-ed23-4430-84b6-f0652be25261-node-bootstrap-token\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.789930 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lcf\" (UniqueName: \"kubernetes.io/projected/f8720bb6-e6ea-43b3-a750-d0b7c1221266-kube-api-access-c5lcf\") pod \"collect-profiles-29565840-27mqk\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.800780 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.809693 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.817959 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-49fwh"] Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.818881 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8bl8\" (UniqueName: \"kubernetes.io/projected/be6d29c8-3af3-4395-accc-c87b7685791e-kube-api-access-w8bl8\") pod \"olm-operator-6b444d44fb-475c8\" (UID: \"be6d29c8-3af3-4395-accc-c87b7685791e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.820426 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9nf2\" (UniqueName: \"kubernetes.io/projected/8ee614fe-98c9-4ae4-9233-dfe531cc54f9-kube-api-access-h9nf2\") pod \"package-server-manager-789f6589d5-f7jjj\" (UID: \"8ee614fe-98c9-4ae4-9233-dfe531cc54f9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.860819 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c555a523-2329-4b06-be49-cb3ffa1e5212-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9h5w\" (UID: \"c555a523-2329-4b06-be49-cb3ffa1e5212\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 
20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.870251 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: E0319 20:07:54.870776 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.370758445 +0000 UTC m=+152.976711517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.889798 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" event={"ID":"fcc1241a-6888-4bcc-b784-474bc8865f63","Type":"ContainerStarted","Data":"851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.889852 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" event={"ID":"fcc1241a-6888-4bcc-b784-474bc8865f63","Type":"ContainerStarted","Data":"6c983e470723989c2083ce3148a56465520f16fba7a8d7c309a4f4e03a952d3c"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 
20:07:54.890998 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.910585 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqrqq\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-kube-api-access-pqrqq\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.910632 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6854j\" (UniqueName: \"kubernetes.io/projected/01bc1ca8-80c6-4b25-9cd5-ff33929f35d1-kube-api-access-6854j\") pod \"packageserver-d55dfcdfc-rrsbl\" (UID: \"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.911550 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rntt\" (UniqueName: \"kubernetes.io/projected/38afdf16-38e4-47b5-b3d7-aa040962429d-kube-api-access-8rntt\") pod \"csi-hostpathplugin-ttwrh\" (UID: \"38afdf16-38e4-47b5-b3d7-aa040962429d\") " pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.915581 4799 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zxq6v container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.915624 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.922546 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zxk4t" event={"ID":"ae84a622-64a2-408d-9275-b3ece41df758","Type":"ContainerStarted","Data":"9cf783b4ee06fb005ed642325adf261f905ba17f8bba39d8abc83aa2b702f23b"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.924078 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zxk4t" event={"ID":"ae84a622-64a2-408d-9275-b3ece41df758","Type":"ContainerStarted","Data":"8206d95d367073a501573b45dbbc425304a8e88aa27663236103996db95ca75c"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.926555 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" event={"ID":"b05f6d19-514d-4d86-8c9b-31606b794551","Type":"ContainerStarted","Data":"d8eedc0f618f406bb52bc493ac5a42d4cc68a411550208de8874eb088a0c37a8"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.926581 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" event={"ID":"b05f6d19-514d-4d86-8c9b-31606b794551","Type":"ContainerStarted","Data":"682763c49c79b478123ecc6010b8bef495ef7d7772a464ae6eda7efdfdba0fb8"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.927155 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9xz\" (UniqueName: \"kubernetes.io/projected/0d0d7527-3b92-4479-80b9-4a71e7c1054b-kube-api-access-ln9xz\") pod \"service-ca-operator-777779d784-vpwsz\" (UID: \"0d0d7527-3b92-4479-80b9-4a71e7c1054b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 
20:07:54.934247 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" event={"ID":"fd5f52c6-275b-43e3-bc17-9fd85638d9bb","Type":"ContainerStarted","Data":"e024b08fee16fb9446c0790ed7f95cf6512d83af70f4384982f62eae6d313cae"} Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.958584 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-bound-sa-token\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.965547 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.966412 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trxlp\" (UniqueName: \"kubernetes.io/projected/0af95971-f650-4f4a-994e-72e58a5ba378-kube-api-access-trxlp\") pod \"multus-admission-controller-857f4d67dd-mtvrz\" (UID: \"0af95971-f650-4f4a-994e-72e58a5ba378\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:54 crc kubenswrapper[4799]: I0319 20:07:54.966750 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.971143 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 
20:07:54.972560 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.472518292 +0000 UTC m=+153.078471364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.973010 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:07:55 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:07:55 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:07:55 crc kubenswrapper[4799]: healthz check failed Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.973058 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.978492 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2qv\" (UniqueName: \"kubernetes.io/projected/8d86bd00-091f-4a51-ac44-af09c78f1857-kube-api-access-rs2qv\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.986673 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.992306 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" event={"ID":"d00dca6c-4b7a-4e1a-978b-44e97cba491a","Type":"ContainerStarted","Data":"b33ffbed38b6c7ce9fa63874320e55041c036b18af407714a1e771dba44284b5"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.992357 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" event={"ID":"d00dca6c-4b7a-4e1a-978b-44e97cba491a","Type":"ContainerStarted","Data":"a6b0786f9089273bea8043bcb648b6a68a1353a89fb1eb6a4a10a829c3eaa4a0"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.992808 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.993482 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.995063 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lq2lw" event={"ID":"343a3122-a4be-4c67-bef4-22cd0e482cea","Type":"ContainerStarted","Data":"7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:54.995093 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lq2lw" event={"ID":"343a3122-a4be-4c67-bef4-22cd0e482cea","Type":"ContainerStarted","Data":"165cff98bbda3f03191827287e0a4d93856840ad6c9c246a0c991af64fc902f8"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.003141 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.003850 4799 generic.go:334] "Generic (PLEG): container finished" podID="b363a50c-90bb-42cf-8129-2b5672a86812" containerID="f747ab95bc04cf56cac3d75dc48b50d9df05567d67e135ece0f23084834db370" exitCode=0 Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.003899 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" event={"ID":"b363a50c-90bb-42cf-8129-2b5672a86812","Type":"ContainerDied","Data":"f747ab95bc04cf56cac3d75dc48b50d9df05567d67e135ece0f23084834db370"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.007354 4799 patch_prober.go:28] interesting pod/console-operator-58897d9998-wjwdj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.007410 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" podUID="d00dca6c-4b7a-4e1a-978b-44e97cba491a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.007972 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d86bd00-091f-4a51-ac44-af09c78f1857-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fkg75\" (UID: \"8d86bd00-091f-4a51-ac44-af09c78f1857\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.011635 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" 
event={"ID":"5e87733e-0b51-49dc-b43c-74479aa30aa2","Type":"ContainerStarted","Data":"3b546e8b17e5889312228ff5096ec81556c9f63baf793d039cc66bf4daa035ff"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.016379 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" event={"ID":"f61ce38b-72a4-41b0-9416-7956d3cc1133","Type":"ContainerStarted","Data":"29b70902477c28337f6054cb63a013fd276b446a0aa26903864ac3f9eed1d192"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.016557 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.018962 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dh2\" (UniqueName: \"kubernetes.io/projected/fd7aca22-2318-490a-bf8c-33e72bc84a8c-kube-api-access-b4dh2\") pod \"migrator-59844c95c7-5lgmn\" (UID: \"fd7aca22-2318-490a-bf8c-33e72bc84a8c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.027012 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" event={"ID":"31337dcf-6a02-4e29-a768-15c7077c99d7","Type":"ContainerStarted","Data":"3a899f8c789a65dbd7d961468ccb955a1142104c26fa3efe27a7a705874229f1"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.027470 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.029409 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.032218 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" event={"ID":"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee","Type":"ContainerStarted","Data":"a95e598b7b607b7629d15928687b5501a2465bf87d185e68c611d72cd782ebdc"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.032279 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" event={"ID":"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee","Type":"ContainerStarted","Data":"8cb06112566cd25590ba0e4589ee39a4cfee5c804503f547ded2dbe500f75048"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.039077 4799 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fhbnk container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.039126 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" podUID="31337dcf-6a02-4e29-a768-15c7077c99d7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.039353 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.041175 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" event={"ID":"18d60a3e-2409-4136-b0f2-f46a3860cc9f","Type":"ContainerStarted","Data":"461472fa274e754d1d09ce7776f8c41c90aa03710c9363ef01d592476c43ced9"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.042712 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" event={"ID":"a06124f0-b975-4b73-b58d-f678af8cda26","Type":"ContainerStarted","Data":"8ad8a07d013a58e12b0bf71761d83b6e505fca0effc26849212fa0044c32cad5"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.043972 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.053639 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.061075 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcnp\" (UniqueName: \"kubernetes.io/projected/f3ac81e4-55c6-479f-85c7-bc28c56aa3a2-kube-api-access-cvcnp\") pod \"dns-default-gksq8\" (UID: \"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2\") " pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.063538 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.065908 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" event={"ID":"c676b337-d700-423f-9fdc-d82946d7e0c4","Type":"ContainerStarted","Data":"8f57382a15c20e782c884a3cdb6b4ad0a41622218938b67bfeb38b4229a4de40"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.066351 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" event={"ID":"c676b337-d700-423f-9fdc-d82946d7e0c4","Type":"ContainerStarted","Data":"7c257599eb627f6a51cf8561305a11115dc0412c6254debbd334b7c40a01a9a3"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.069846 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" event={"ID":"c17ad003-bca3-46f4-8227-1d59764cf084","Type":"ContainerStarted","Data":"6d4c1688a16060832cfe0d41a2c4f97e4a0bdcaa062bfd53306719211fa29efc"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.074209 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.075422 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.575405525 +0000 UTC m=+153.181358597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.077298 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" event={"ID":"6627dc01-4b17-4951-ac65-b51d037f5215","Type":"ContainerStarted","Data":"daa63a55ddc23ee870f23b7066c7e2bed762bb8d95fd32c6f19cfd210412a833"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.077336 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" event={"ID":"6627dc01-4b17-4951-ac65-b51d037f5215","Type":"ContainerStarted","Data":"cd0f0743230ed2c68341011682b409e3dcf4ea730eb4a94d6e687638e92f5900"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.078739 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.080172 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" event={"ID":"274f0b01-a045-405f-80e8-f278a87a97ce","Type":"ContainerStarted","Data":"a0d059282d766b14e230b18f4a9dfb246a22fd8b2ecf2ae602ecaa0332ee3b55"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.080239 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" event={"ID":"274f0b01-a045-405f-80e8-f278a87a97ce","Type":"ContainerStarted","Data":"8f5f8fc4669ded38bd94f9eb9a681c6174c92d6494233e45944795ce11d1f905"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.080272 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.083951 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbghb\" (UniqueName: \"kubernetes.io/projected/f0dce6fd-e631-404f-8864-6c7f0fcbd52f-kube-api-access-pbghb\") pod \"ingress-canary-9rsf6\" (UID: \"f0dce6fd-e631-404f-8864-6c7f0fcbd52f\") " pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.085080 4799 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-j5t7c container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.085205 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" podUID="274f0b01-a045-405f-80e8-f278a87a97ce" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.085380 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" event={"ID":"1bb0b885-60b4-4211-8c18-8d759ff37ace","Type":"ContainerStarted","Data":"6ef112808cb5fd937db079a9ba1d9a2cb5351053f33ba13ab39cefb258198863"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.102030 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" event={"ID":"eb2fa492-7952-4155-9f40-bb6f9c569a4f","Type":"ContainerStarted","Data":"dc91c2cfcafbe1bbeda8d40669bdf09af70cc5e3f0ac0eace98687cd97dcce4b"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.107910 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" event={"ID":"75926aed-864d-42ee-aabf-89e5579606a7","Type":"ContainerStarted","Data":"b0d9dcebf6dec16613056a2a655a4cc33df7f6438b9deea6689980bbb526bd9a"} Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.114147 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9gq\" (UniqueName: \"kubernetes.io/projected/46dfa0a6-ed23-4430-84b6-f0652be25261-kube-api-access-4s9gq\") pod \"machine-config-server-lcd4j\" (UID: \"46dfa0a6-ed23-4430-84b6-f0652be25261\") " pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.115285 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" event={"ID":"e7a739cc-cb77-45dc-9811-661046ccf05b","Type":"ContainerStarted","Data":"6ee473103d264e9b3b762a626c12fd6bfc7ca00a408944ae24edffc3e21f832a"} Mar 19 20:07:55 crc kubenswrapper[4799]: W0319 20:07:55.176015 4799 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd97bd4_11f1_48a2_ba74_2eba33de161b.slice/crio-58dd2a67949e784323325c59dcd31bcf2d6460a528f041f3169aec5b0068584c WatchSource:0}: Error finding container 58dd2a67949e784323325c59dcd31bcf2d6460a528f041f3169aec5b0068584c: Status 404 returned error can't find the container with id 58dd2a67949e784323325c59dcd31bcf2d6460a528f041f3169aec5b0068584c Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.179754 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.183227 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.683203272 +0000 UTC m=+153.289156344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.256014 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.279402 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.282088 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.282551 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.782439491 +0000 UTC m=+153.388392623 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: W0319 20:07:55.298867 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c04f3d_48cb_4f6c_b3d5_b7f3fa554a84.slice/crio-1e2fc11fd7e81ed8a5a57e7668d65866bfd9e58320d8031937233d1d76fdd91b WatchSource:0}: Error finding container 1e2fc11fd7e81ed8a5a57e7668d65866bfd9e58320d8031937233d1d76fdd91b: Status 404 returned error can't find the container with id 1e2fc11fd7e81ed8a5a57e7668d65866bfd9e58320d8031937233d1d76fdd91b Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.310350 4799 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.359060 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.382839 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.383146 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.383241 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.383282 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.387324 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-image-import-ca\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.387629 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.887593967 +0000 UTC m=+153.493547089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.387906 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9rsf6" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.388937 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lcd4j" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.391859 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7-etcd-client\") pod \"apiserver-76f77b778f-nthhp\" (UID: \"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7\") " pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.393988 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b71b47b-d667-49c1-ae5b-3326bcde5508-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6ddkj\" (UID: \"0b71b47b-d667-49c1-ae5b-3326bcde5508\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.485535 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.487135 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:55.987097182 +0000 UTC m=+153.593050254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.495636 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.530294 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.584816 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.585278 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mtvrz"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.587424 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.587529 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:56.087505597 +0000 UTC m=+153.693458669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.587592 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.588429 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.088415538 +0000 UTC m=+153.694368610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: W0319 20:07:55.628716 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf651426_59b2_46f8_9e60_011b0d134c7d.slice/crio-a3d98ae7b017ae4396a64b4a4e96deb55991d4140de0e4d2003f2ee4bfbcdd29 WatchSource:0}: Error finding container a3d98ae7b017ae4396a64b4a4e96deb55991d4140de0e4d2003f2ee4bfbcdd29: Status 404 returned error can't find the container with id a3d98ae7b017ae4396a64b4a4e96deb55991d4140de0e4d2003f2ee4bfbcdd29 Mar 19 20:07:55 crc kubenswrapper[4799]: W0319 20:07:55.687043 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af95971_f650_4f4a_994e_72e58a5ba378.slice/crio-c8942ad31329d5ccaa7bcd8d936dde77c36c8b6a4f11e96e4beb77ea8280afa2 WatchSource:0}: Error finding container c8942ad31329d5ccaa7bcd8d936dde77c36c8b6a4f11e96e4beb77ea8280afa2: Status 404 returned error can't find the container with id c8942ad31329d5ccaa7bcd8d936dde77c36c8b6a4f11e96e4beb77ea8280afa2 Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.688578 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.688909 4799 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.188895256 +0000 UTC m=+153.794848328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: W0319 20:07:55.696465 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46dfa0a6_ed23_4430_84b6_f0652be25261.slice/crio-7f93325be6a2346f41f92bc259c6ced725d2a6cc84c5a6a8febba23a9e1983cc WatchSource:0}: Error finding container 7f93325be6a2346f41f92bc259c6ced725d2a6cc84c5a6a8febba23a9e1983cc: Status 404 returned error can't find the container with id 7f93325be6a2346f41f92bc259c6ced725d2a6cc84c5a6a8febba23a9e1983cc Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.789988 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.790641 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.290629752 +0000 UTC m=+153.896582824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.898240 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:55 crc kubenswrapper[4799]: E0319 20:07:55.898811 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.398794918 +0000 UTC m=+154.004747990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.910152 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-x5czz"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.912055 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.917482 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.944191 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.970798 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk"] Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.974416 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:07:55 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:07:55 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:07:55 crc kubenswrapper[4799]: healthz check failed Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 
20:07:55.974460 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:07:55 crc kubenswrapper[4799]: W0319 20:07:55.975610 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda53dfbf1_613e_4465_ae93_b9a146216ec4.slice/crio-7c62a1840420d90c5a9109adf8e18540d094ee87f87919c9936abfb8282937a6 WatchSource:0}: Error finding container 7c62a1840420d90c5a9109adf8e18540d094ee87f87919c9936abfb8282937a6: Status 404 returned error can't find the container with id 7c62a1840420d90c5a9109adf8e18540d094ee87f87919c9936abfb8282937a6 Mar 19 20:07:55 crc kubenswrapper[4799]: I0319 20:07:55.999436 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.000657 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.500642227 +0000 UTC m=+154.106595299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.063319 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.064740 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.095580 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ttwrh"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.101661 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.101866 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.601833361 +0000 UTC m=+154.207786433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.102101 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.102625 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.602616429 +0000 UTC m=+154.208569501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.144959 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" event={"ID":"af651426-59b2-46f8-9e60-011b0d134c7d","Type":"ContainerStarted","Data":"1a669534684bb3906c8252035d2c772da83ba083db22a6e2eb12856eab7fb9eb"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.145570 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" event={"ID":"af651426-59b2-46f8-9e60-011b0d134c7d","Type":"ContainerStarted","Data":"a3d98ae7b017ae4396a64b4a4e96deb55991d4140de0e4d2003f2ee4bfbcdd29"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.156530 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" event={"ID":"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84","Type":"ContainerStarted","Data":"1710551067cdf538375895f67bbb5d1e22a96ca5aedf4a264877a6d21c09cda4"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.156585 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" event={"ID":"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84","Type":"ContainerStarted","Data":"1e2fc11fd7e81ed8a5a57e7668d65866bfd9e58320d8031937233d1d76fdd91b"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.169519 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" event={"ID":"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1","Type":"ContainerStarted","Data":"3009061eb89a2f76e62f653d06f2a8bbe188809b24e4d3bb6b31ae57a8d5935e"} Mar 19 20:07:56 crc kubenswrapper[4799]: W0319 20:07:56.202623 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee614fe_98c9_4ae4_9233_dfe531cc54f9.slice/crio-8346b87b4e9f45b98807087237e12e6a3cc1686e0baee58b95da60629a1b89ea WatchSource:0}: Error finding container 8346b87b4e9f45b98807087237e12e6a3cc1686e0baee58b95da60629a1b89ea: Status 404 returned error can't find the container with id 8346b87b4e9f45b98807087237e12e6a3cc1686e0baee58b95da60629a1b89ea Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.204144 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.204783 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.704769745 +0000 UTC m=+154.310722817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.215238 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" event={"ID":"51120354-ef9e-442c-b1f0-2bcaf2e4d5ee","Type":"ContainerStarted","Data":"26c9909aa4fee9e39cb1ce01db843f3c8911640340223f7180b2d4618c26b5b0"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.240039 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nthhp"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.245178 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" event={"ID":"a06124f0-b975-4b73-b58d-f678af8cda26","Type":"ContainerStarted","Data":"fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.245812 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.281495 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6ddkj"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.283187 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.290595 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" event={"ID":"b363a50c-90bb-42cf-8129-2b5672a86812","Type":"ContainerStarted","Data":"bd6e2588cea0d4a8ad036f82947a9e396aa9d445adae8cb17e15f7183ba6db30"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.294445 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" event={"ID":"0d0d7527-3b92-4479-80b9-4a71e7c1054b","Type":"ContainerStarted","Data":"abe4be5364ea8daf6ade18d203e11f450f424a140e574e2f493664414de59d69"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.295326 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" event={"ID":"c555a523-2329-4b06-be49-cb3ffa1e5212","Type":"ContainerStarted","Data":"3dd3b43961512b23cd21c5e5bb3f8d00f63d2eef039f71ee4b048be315303d6a"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.298455 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" event={"ID":"8dd97bd4-11f1-48a2-ba74-2eba33de161b","Type":"ContainerStarted","Data":"14980611bec8e420ee7fe7d7fd16d8fa20c35751fac5071d75630115633f9cbf"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.298525 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" event={"ID":"8dd97bd4-11f1-48a2-ba74-2eba33de161b","Type":"ContainerStarted","Data":"58dd2a67949e784323325c59dcd31bcf2d6460a528f041f3169aec5b0068584c"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.309887 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: 
\"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.310190 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.810177717 +0000 UTC m=+154.416130789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.316677 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" event={"ID":"f8720bb6-e6ea-43b3-a750-d0b7c1221266","Type":"ContainerStarted","Data":"0e8e2f54ab2b3db9a69b7da0dd50c59699a03c28543c57776c4663f630dca0ff"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.322455 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" event={"ID":"18d60a3e-2409-4136-b0f2-f46a3860cc9f","Type":"ContainerStarted","Data":"93c06ee74365bc57fcb4509cad09d3281459f646cdda816f6964ad634b711733"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.328339 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.330272 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" 
event={"ID":"be6d29c8-3af3-4395-accc-c87b7685791e","Type":"ContainerStarted","Data":"4a1be9369a88c518047de8098d4147dd579dcc6ae28109d4a728a6ea6663bae2"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.337146 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" event={"ID":"e7a739cc-cb77-45dc-9811-661046ccf05b","Type":"ContainerStarted","Data":"cd3deca1b9f8858ef86103f41d40b0d37ff03a0f5444ba684060abe85cac4b48"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.342858 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.347774 4799 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wjn4f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/healthz\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.347832 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.15:8080/healthz\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.348881 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" event={"ID":"c17ad003-bca3-46f4-8227-1d59764cf084","Type":"ContainerStarted","Data":"6e2e741ebfb1f598dbc83e37227e9ec8f9b6193a23e4bed4a9758a6339156d44"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.356369 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" 
event={"ID":"31337dcf-6a02-4e29-a768-15c7077c99d7","Type":"ContainerStarted","Data":"83fa9d8c243935fc5d100484ddc8d7f34b06888ff0e09800dba6c7e0d1f96d07"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.364001 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" event={"ID":"a53dfbf1-613e-4465-ae93-b9a146216ec4","Type":"ContainerStarted","Data":"7c62a1840420d90c5a9109adf8e18540d094ee87f87919c9936abfb8282937a6"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.368251 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" event={"ID":"f61ce38b-72a4-41b0-9416-7956d3cc1133","Type":"ContainerStarted","Data":"2cbca3dc16ff4083f0cccbb2885d1adc6329a23c71a1c08a4d2a19eb78d95a91"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.372280 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.373248 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gksq8"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.379274 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" event={"ID":"b28d29c0-1b6f-41bc-b103-016c2ee40e42","Type":"ContainerStarted","Data":"5b2ec633178bbe049e21ee239a5c30655e4838e9e879581dc8c53497d4fb1806"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.379318 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" event={"ID":"b28d29c0-1b6f-41bc-b103-016c2ee40e42","Type":"ContainerStarted","Data":"1f234fd6ed2255c936faf4ef9685d099a1dc1c69405557d674906228403d5aa0"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.380265 4799 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.393338 4799 generic.go:334] "Generic (PLEG): container finished" podID="eb2fa492-7952-4155-9f40-bb6f9c569a4f" containerID="ae77b67bf69cc5828a0f2ee159642b22dd0e4ff376a8fe05734e94c75602b7f3" exitCode=0 Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.393416 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" event={"ID":"eb2fa492-7952-4155-9f40-bb6f9c569a4f","Type":"ContainerDied","Data":"ae77b67bf69cc5828a0f2ee159642b22dd0e4ff376a8fe05734e94c75602b7f3"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.397097 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" event={"ID":"5e87733e-0b51-49dc-b43c-74479aa30aa2","Type":"ContainerStarted","Data":"559213d0666891a7988397371b0bf73437274054db7d3a8929868711cca47d87"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.406599 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" event={"ID":"fd5f52c6-275b-43e3-bc17-9fd85638d9bb","Type":"ContainerStarted","Data":"e22a1dca373a9e6519f09b7af04bffae4c9038c782865777355533c7f58e3044"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.409192 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9rsf6"] Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.409375 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" event={"ID":"1bb0b885-60b4-4211-8c18-8d759ff37ace","Type":"ContainerStarted","Data":"9b31f6d2db6dbc16064d915a1c289099c1d8caac7b84f3fa80f595410fd1cec4"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 
20:07:56.410364 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.410932 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:56.91091584 +0000 UTC m=+154.516868912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.507421 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" event={"ID":"75926aed-864d-42ee-aabf-89e5579606a7","Type":"ContainerStarted","Data":"28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.508605 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.510864 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lcd4j" 
event={"ID":"46dfa0a6-ed23-4430-84b6-f0652be25261","Type":"ContainerStarted","Data":"7f93325be6a2346f41f92bc259c6ced725d2a6cc84c5a6a8febba23a9e1983cc"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.512591 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.512886 4799 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-nzcwj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.512915 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" podUID="75926aed-864d-42ee-aabf-89e5579606a7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.515705 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-49fwh" event={"ID":"e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d","Type":"ContainerStarted","Data":"7e513506a51911c42ef7ce5679442a5d0327015378c3433ae01e0a4a354edd21"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.515750 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-49fwh" 
event={"ID":"e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d","Type":"ContainerStarted","Data":"b7c534634268bfe1214276f62d46f1c7a48cbd666da5232bae806474ba522954"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.516271 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.517932 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.017914628 +0000 UTC m=+154.623867770 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.531618 4799 patch_prober.go:28] interesting pod/downloads-7954f5f757-49fwh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.531664 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-49fwh" podUID="e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.533845 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" event={"ID":"c676b337-d700-423f-9fdc-d82946d7e0c4","Type":"ContainerStarted","Data":"ffefe7f13892b0ce321e142202bb1c3fed1dcc0f6fcfc37b4fd956a15948d491"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.560850 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" event={"ID":"0af95971-f650-4f4a-994e-72e58a5ba378","Type":"ContainerStarted","Data":"c8942ad31329d5ccaa7bcd8d936dde77c36c8b6a4f11e96e4beb77ea8280afa2"} Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.563757 4799 patch_prober.go:28] interesting pod/console-operator-58897d9998-wjwdj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.563788 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" podUID="d00dca6c-4b7a-4e1a-978b-44e97cba491a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.564252 4799 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zxq6v container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.564297 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.615271 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.616900 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.11687939 +0000 UTC m=+154.722832462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.673285 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" podStartSLOduration=90.673264746 podStartE2EDuration="1m30.673264746s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.672460257 +0000 UTC m=+154.278413319" watchObservedRunningTime="2026-03-19 20:07:56.673264746 +0000 UTC m=+154.279217818" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.716268 4799 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.717379 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.717866 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.217849459 +0000 UTC m=+154.823802531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.778208 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zxk4t" podStartSLOduration=90.778189016 podStartE2EDuration="1m30.778189016s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.744094587 +0000 UTC m=+154.350047659" watchObservedRunningTime="2026-03-19 20:07:56.778189016 +0000 UTC m=+154.384142088" Mar 19 20:07:56 
crc kubenswrapper[4799]: I0319 20:07:56.782306 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pqjmc" podStartSLOduration=90.782297482 podStartE2EDuration="1m30.782297482s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.776415065 +0000 UTC m=+154.382368137" watchObservedRunningTime="2026-03-19 20:07:56.782297482 +0000 UTC m=+154.388250554" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.794760 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" podStartSLOduration=5.793356758 podStartE2EDuration="5.793356758s" podCreationTimestamp="2026-03-19 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.790526802 +0000 UTC m=+154.396479874" watchObservedRunningTime="2026-03-19 20:07:56.793356758 +0000 UTC m=+154.399309830" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.819791 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.820319 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.320293612 +0000 UTC m=+154.926246684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.889013 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" podStartSLOduration=90.888982813 podStartE2EDuration="1m30.888982813s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.834542342 +0000 UTC m=+154.440495434" watchObservedRunningTime="2026-03-19 20:07:56.888982813 +0000 UTC m=+154.494935885" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.917039 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" podStartSLOduration=90.917012962 podStartE2EDuration="1m30.917012962s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.911481804 +0000 UTC m=+154.517434876" watchObservedRunningTime="2026-03-19 20:07:56.917012962 +0000 UTC m=+154.522966034" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.921474 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: 
\"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:56 crc kubenswrapper[4799]: E0319 20:07:56.921813 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.421798853 +0000 UTC m=+155.027751925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.970098 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:07:56 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:07:56 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:07:56 crc kubenswrapper[4799]: healthz check failed Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.970166 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:07:56 crc kubenswrapper[4799]: I0319 20:07:56.972059 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lq2lw" podStartSLOduration=90.972047997 
podStartE2EDuration="1m30.972047997s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:56.971085934 +0000 UTC m=+154.577038996" watchObservedRunningTime="2026-03-19 20:07:56.972047997 +0000 UTC m=+154.578001069" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.023760 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.025640 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.525612807 +0000 UTC m=+155.131565879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.114944 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5dnp8" podStartSLOduration=91.114917016 podStartE2EDuration="1m31.114917016s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.113166255 +0000 UTC m=+154.719119327" watchObservedRunningTime="2026-03-19 20:07:57.114917016 +0000 UTC m=+154.720870088" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.128390 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.129002 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.628986212 +0000 UTC m=+155.234939274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.225160 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-942vf" podStartSLOduration=91.225145239 podStartE2EDuration="1m31.225145239s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.224844052 +0000 UTC m=+154.830797124" watchObservedRunningTime="2026-03-19 20:07:57.225145239 +0000 UTC m=+154.831098311" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.226971 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zblhc" podStartSLOduration=91.226965851 podStartE2EDuration="1m31.226965851s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.178824346 +0000 UTC m=+154.784777418" watchObservedRunningTime="2026-03-19 20:07:57.226965851 +0000 UTC m=+154.832918923" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.232186 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.232996 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.732973291 +0000 UTC m=+155.338926363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.318834 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z5fgk" podStartSLOduration=91.318815669 podStartE2EDuration="1m31.318815669s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.317432637 +0000 UTC m=+154.923385709" watchObservedRunningTime="2026-03-19 20:07:57.318815669 +0000 UTC m=+154.924768741" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.338942 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" 
Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.339433 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.839416606 +0000 UTC m=+155.445369678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.444178 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.444957 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:57.94494127 +0000 UTC m=+155.550894342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.472242 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" podStartSLOduration=91.472226742 podStartE2EDuration="1m31.472226742s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.448672097 +0000 UTC m=+155.054625159" watchObservedRunningTime="2026-03-19 20:07:57.472226742 +0000 UTC m=+155.078179814" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.502498 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dcbvh" podStartSLOduration=91.502483903 podStartE2EDuration="1m31.502483903s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.473914251 +0000 UTC m=+155.079867323" watchObservedRunningTime="2026-03-19 20:07:57.502483903 +0000 UTC m=+155.108436975" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.518381 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fhbnk" podStartSLOduration=91.518363391 podStartE2EDuration="1m31.518363391s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.516763204 +0000 UTC m=+155.122716276" watchObservedRunningTime="2026-03-19 20:07:57.518363391 +0000 UTC m=+155.124316463" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.547147 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.547485 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:58.047470865 +0000 UTC m=+155.653423937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.562970 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" podStartSLOduration=91.562955934 podStartE2EDuration="1m31.562955934s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.562734799 +0000 UTC m=+155.168687871" watchObservedRunningTime="2026-03-19 20:07:57.562955934 +0000 UTC m=+155.168909006" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.585206 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-dzmpr"] Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.606446 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2xv74" event={"ID":"5e87733e-0b51-49dc-b43c-74479aa30aa2","Type":"ContainerStarted","Data":"21808d6b806b0015032920ea2ee71544712d4555bf56c12aa16b31441330bfd0"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.635420 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" podStartSLOduration=91.635399742 podStartE2EDuration="1m31.635399742s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 20:07:57.632943685 +0000 UTC m=+155.238896757" watchObservedRunningTime="2026-03-19 20:07:57.635399742 +0000 UTC m=+155.241352814" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.640986 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" event={"ID":"c555a523-2329-4b06-be49-cb3ffa1e5212","Type":"ContainerStarted","Data":"488eec01c91a7bbb107f174a12a1f5dbcfcfad69f82fd12e50bc541d71ad2bb9"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.648033 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.649473 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:58.149455108 +0000 UTC m=+155.755408180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.665944 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" event={"ID":"0af95971-f650-4f4a-994e-72e58a5ba378","Type":"ContainerStarted","Data":"a248c023f1eb1a98261a8be045e224666b3c1658820c71ceb7de84ee850c03a2"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.667904 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qb26c" podStartSLOduration=91.667890835 podStartE2EDuration="1m31.667890835s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.665778166 +0000 UTC m=+155.271731238" watchObservedRunningTime="2026-03-19 20:07:57.667890835 +0000 UTC m=+155.273843907" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.691693 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lcd4j" event={"ID":"46dfa0a6-ed23-4430-84b6-f0652be25261","Type":"ContainerStarted","Data":"281465125ead1801b8a637ca8acdfaef45d3a4fd96cfebacfe0fcd6777820c64"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.712265 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxzt7" 
podStartSLOduration=91.712248552 podStartE2EDuration="1m31.712248552s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.709788435 +0000 UTC m=+155.315741507" watchObservedRunningTime="2026-03-19 20:07:57.712248552 +0000 UTC m=+155.318201624" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.726769 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" event={"ID":"8d86bd00-091f-4a51-ac44-af09c78f1857","Type":"ContainerStarted","Data":"4c8bf8f86eaacd639b0ee75223c7a3a7153c79d3468ef1939230e493e43faa47"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.727073 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" event={"ID":"8d86bd00-091f-4a51-ac44-af09c78f1857","Type":"ContainerStarted","Data":"a4e30e3c045b4cc662a8a46e60c8b6a612868319b187b3ed07f1187fd11bd1d5"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.727083 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" event={"ID":"8d86bd00-091f-4a51-ac44-af09c78f1857","Type":"ContainerStarted","Data":"1e860ea1b1b0a774df1e6905ff22238d7dba11ae0ffa330a601f9db423d3b9bd"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.748491 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" event={"ID":"38afdf16-38e4-47b5-b3d7-aa040962429d","Type":"ContainerStarted","Data":"51b9bfae24d83e26da5c69122f60b32f5568f8d9ae2df8954fd4c1ce1d61fd3d"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.750417 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hqtdl" podStartSLOduration=91.750382685 podStartE2EDuration="1m31.750382685s" 
podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.748778398 +0000 UTC m=+155.354731470" watchObservedRunningTime="2026-03-19 20:07:57.750382685 +0000 UTC m=+155.356335757" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.779545 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.780081 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:58.280061503 +0000 UTC m=+155.886014565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.782784 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" event={"ID":"af651426-59b2-46f8-9e60-011b0d134c7d","Type":"ContainerStarted","Data":"cf6d09df5becb3bb58e9d02153b36b0abdcceb40a4da1e647f953d2a7bc01494"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.800250 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9rsf6" event={"ID":"f0dce6fd-e631-404f-8864-6c7f0fcbd52f","Type":"ContainerStarted","Data":"c5bd06e8470c45b6b2c07000fca190b7c6745c8f2d065404173db32aff87f211"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.800303 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9rsf6" event={"ID":"f0dce6fd-e631-404f-8864-6c7f0fcbd52f","Type":"ContainerStarted","Data":"88b391806eed5b6757114507055ae00d5c504f2a8c9932e0f4d3deba1f7bab93"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.818810 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-49fwh" podStartSLOduration=91.81877654 podStartE2EDuration="1m31.81877654s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.793856072 +0000 UTC m=+155.399809134" watchObservedRunningTime="2026-03-19 20:07:57.81877654 
+0000 UTC m=+155.424729612" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.843535 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" event={"ID":"a53dfbf1-613e-4465-ae93-b9a146216ec4","Type":"ContainerStarted","Data":"13530dadd451008aa0e97d08bc337855c8fbe164f76056a92047997923f640a0"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.850211 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" event={"ID":"0d0d7527-3b92-4479-80b9-4a71e7c1054b","Type":"ContainerStarted","Data":"bb896324bc26f4a518ed9d9cfa82a40e604f73e61b8dd49398913d17d852f30e"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.871855 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gksq8" event={"ID":"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2","Type":"ContainerStarted","Data":"d52347f58fda65727759b906aabb7b52244ddb30845881c0fa1d59b0a6c53261"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.871926 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gksq8" event={"ID":"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2","Type":"ContainerStarted","Data":"340ecc4364afc61cde2cf4ce327f4f3f347343c6b671684a57b7a8633eeeb423"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.881714 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.883280 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:58.383263843 +0000 UTC m=+155.989216915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.906944 4799 generic.go:334] "Generic (PLEG): container finished" podID="5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7" containerID="a5209069baa18c02e19ea39a40f3b6ac646484b613b27dd0b85d2cd4a283d263" exitCode=0 Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.907066 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" event={"ID":"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7","Type":"ContainerDied","Data":"a5209069baa18c02e19ea39a40f3b6ac646484b613b27dd0b85d2cd4a283d263"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.907100 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" event={"ID":"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7","Type":"ContainerStarted","Data":"16dd342e52b08cc97de0a8f4555db407d3e972c5a52f7c0e6af1acef9d0055a0"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.912671 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" podStartSLOduration=91.912643134 podStartE2EDuration="1m31.912643134s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.877827207 +0000 UTC m=+155.483780279" 
watchObservedRunningTime="2026-03-19 20:07:57.912643134 +0000 UTC m=+155.518596206" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.941956 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lcd4j" podStartSLOduration=6.941929342 podStartE2EDuration="6.941929342s" podCreationTimestamp="2026-03-19 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:57.940835847 +0000 UTC m=+155.546788909" watchObservedRunningTime="2026-03-19 20:07:57.941929342 +0000 UTC m=+155.547882424" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.951973 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" event={"ID":"eb2fa492-7952-4155-9f40-bb6f9c569a4f","Type":"ContainerStarted","Data":"17f1961c19f30563f98e3b490e20ccd2f50c4cf7224d7fa5cc2ecd93e6a42e06"} Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.952463 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.970605 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:07:57 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:07:57 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:07:57 crc kubenswrapper[4799]: healthz check failed Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.970665 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.971860 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lf2lf"] Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.978127 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.986691 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.987054 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:57 crc kubenswrapper[4799]: E0319 20:07:57.987590 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:58.487570709 +0000 UTC m=+156.093523981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:57 crc kubenswrapper[4799]: I0319 20:07:57.987947 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lf2lf"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.013020 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" event={"ID":"0b71b47b-d667-49c1-ae5b-3326bcde5508","Type":"ContainerStarted","Data":"cac641f2cd20a9ecf5625126deadecd1af5300bef7487727572c2db3a7595fd5"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.013479 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" event={"ID":"0b71b47b-d667-49c1-ae5b-3326bcde5508","Type":"ContainerStarted","Data":"0ca8dcaffa923f1563a3ec1b85a51abd6c162fecad74f71682b763dfd0e0b438"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.013493 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" event={"ID":"0b71b47b-d667-49c1-ae5b-3326bcde5508","Type":"ContainerStarted","Data":"96758e9fa4114e5861f340ac6a4788cfb288a71e1a64191c218e9f516e4e6aa1"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.041638 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" event={"ID":"01bc1ca8-80c6-4b25-9cd5-ff33929f35d1","Type":"ContainerStarted","Data":"5c84bc3253208fe3ca14bd6a07dcab5d0790d1a1d53ae2e312290a88b232c843"} Mar 19 20:07:58 crc 
kubenswrapper[4799]: I0319 20:07:58.043263 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.077958 4799 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rrsbl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.078005 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" podUID="01bc1ca8-80c6-4b25-9cd5-ff33929f35d1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.083044 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" event={"ID":"fd7aca22-2318-490a-bf8c-33e72bc84a8c","Type":"ContainerStarted","Data":"801ab003423f22fbb5fe5d89255830fa88ce3c191987a57290daeb0da0855009"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.083159 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" event={"ID":"fd7aca22-2318-490a-bf8c-33e72bc84a8c","Type":"ContainerStarted","Data":"943071457a4a97ed331e26b6b80027408d91ab59f1c3abd6672e7048f09aa386"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.083241 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" event={"ID":"fd7aca22-2318-490a-bf8c-33e72bc84a8c","Type":"ContainerStarted","Data":"86e2aa06362016d9707a61b1fb8dc070aa81d089005fb2db6c9dfc9b93b525a4"} Mar 19 20:07:58 crc 
kubenswrapper[4799]: I0319 20:07:58.088852 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.089036 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtwfc\" (UniqueName: \"kubernetes.io/projected/0dfa7f99-f171-41a9-8931-07caaeaa06e1-kube-api-access-wtwfc\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.089177 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-catalog-content\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.089210 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-utilities\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.090032 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:58.590013792 +0000 UTC m=+156.195966864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.101405 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" event={"ID":"8ee614fe-98c9-4ae4-9233-dfe531cc54f9","Type":"ContainerStarted","Data":"3e7ba9f722f14d5a3dcd2aae60f2905dd776f3223d7f7336b720a3b163e2323d"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.101468 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" event={"ID":"8ee614fe-98c9-4ae4-9233-dfe531cc54f9","Type":"ContainerStarted","Data":"9ad8583d2f5a0bb488d3962b71333135c1ff5c633a1529e1feee665170561b2f"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.101489 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" event={"ID":"8ee614fe-98c9-4ae4-9233-dfe531cc54f9","Type":"ContainerStarted","Data":"8346b87b4e9f45b98807087237e12e6a3cc1686e0baee58b95da60629a1b89ea"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.101602 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.107057 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" event={"ID":"be6d29c8-3af3-4395-accc-c87b7685791e","Type":"ContainerStarted","Data":"49db5c641e13d3315a8fb90367f5e97686aef962138f43935da5efcdb83fd987"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.107969 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.120588 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-drzzt" podStartSLOduration=92.12056463 podStartE2EDuration="1m32.12056463s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.120265463 +0000 UTC m=+155.726218535" watchObservedRunningTime="2026-03-19 20:07:58.12056463 +0000 UTC m=+155.726517702" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.121352 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p7f89" podStartSLOduration=92.121345988 podStartE2EDuration="1m32.121345988s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.063340194 +0000 UTC m=+155.669293266" watchObservedRunningTime="2026-03-19 20:07:58.121345988 +0000 UTC m=+155.727299060" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.128054 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" 
event={"ID":"e9c04f3d-48cb-4f6c-b3d5-b7f3fa554a84","Type":"ContainerStarted","Data":"e86e00fed3ade012c302448830a9cc68e3150176449b8b23a9197cc3cab26cc4"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.131655 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.134012 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" event={"ID":"f8720bb6-e6ea-43b3-a750-d0b7c1221266","Type":"ContainerStarted","Data":"b9e9a56806475d735ebce84ef1fa1d7783c9dcff70f00bcfab9b6d2c920fe173"} Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.136805 4799 patch_prober.go:28] interesting pod/downloads-7954f5f757-49fwh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.136903 4799 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wjn4f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/healthz\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.136953 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.15:8080/healthz\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.136923 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-49fwh" 
podUID="e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.148164 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.158862 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.177105 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2gp4f"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.186883 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.190635 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-catalog-content\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.190795 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-utilities\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.191016 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.191041 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtwfc\" (UniqueName: \"kubernetes.io/projected/0dfa7f99-f171-41a9-8931-07caaeaa06e1-kube-api-access-wtwfc\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.195059 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-catalog-content\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.195695 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:58.69567338 +0000 UTC m=+156.301626452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.214045 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-utilities\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.216056 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.257404 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wjwdj" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.262848 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2gp4f"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.285270 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9pr5k" podStartSLOduration=92.285239994 podStartE2EDuration="1m32.285239994s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.282104482 +0000 UTC m=+155.888057554" watchObservedRunningTime="2026-03-19 20:07:58.285239994 +0000 UTC 
m=+155.891193056" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.302944 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.303182 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfj45\" (UniqueName: \"kubernetes.io/projected/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-kube-api-access-bfj45\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.303223 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-catalog-content\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.303241 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-utilities\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.303499 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:58.803462546 +0000 UTC m=+156.409415618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.309681 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtwfc\" (UniqueName: \"kubernetes.io/projected/0dfa7f99-f171-41a9-8931-07caaeaa06e1-kube-api-access-wtwfc\") pod \"certified-operators-lf2lf\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.338779 4799 ???:1] "http: TLS handshake error from 192.168.126.11:44956: no serving certificate available for the kubelet" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.356251 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" podStartSLOduration=92.356232519 podStartE2EDuration="1m32.356232519s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.339792578 +0000 UTC m=+155.945745650" watchObservedRunningTime="2026-03-19 20:07:58.356232519 +0000 UTC m=+155.962185591" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.358459 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sq5wd"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.359956 4799 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.373654 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sq5wd"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.388102 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9rsf6" podStartSLOduration=7.388084107 podStartE2EDuration="7.388084107s" podCreationTimestamp="2026-03-19 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.387031622 +0000 UTC m=+155.992984694" watchObservedRunningTime="2026-03-19 20:07:58.388084107 +0000 UTC m=+155.994037169" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407177 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfj45\" (UniqueName: \"kubernetes.io/projected/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-kube-api-access-bfj45\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407218 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-catalog-content\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407256 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-catalog-content\") pod \"community-operators-2gp4f\" (UID: 
\"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407274 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-utilities\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407289 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-utilities\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407325 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.407359 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxbt\" (UniqueName: \"kubernetes.io/projected/9365ab34-43df-42ee-bc04-baeba5579717-kube-api-access-bdxbt\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.408782 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-catalog-content\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.408995 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-utilities\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.409228 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:58.909218106 +0000 UTC m=+156.515171178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.430758 4799 ???:1] "http: TLS handshake error from 192.168.126.11:44958: no serving certificate available for the kubelet" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.457986 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfj45\" (UniqueName: \"kubernetes.io/projected/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-kube-api-access-bfj45\") pod \"community-operators-2gp4f\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " pod="openshift-marketplace/community-operators-2gp4f" Mar 
19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.507552 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.512650 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.514298 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.514684 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-catalog-content\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.514685 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-475c8" podStartSLOduration=92.514664939 podStartE2EDuration="1m32.514664939s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.453719857 +0000 UTC m=+156.059672939" watchObservedRunningTime="2026-03-19 20:07:58.514664939 +0000 UTC m=+156.120618011" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.514753 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-utilities\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.515052 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxbt\" (UniqueName: \"kubernetes.io/projected/9365ab34-43df-42ee-bc04-baeba5579717-kube-api-access-bdxbt\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.515914 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-catalog-content\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.515926 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.015908177 +0000 UTC m=+156.621861249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.516372 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-utilities\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.535116 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.597244 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxbt\" (UniqueName: \"kubernetes.io/projected/9365ab34-43df-42ee-bc04-baeba5579717-kube-api-access-bdxbt\") pod \"certified-operators-sq5wd\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.605670 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.616245 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.616575 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.116561739 +0000 UTC m=+156.722514811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.630983 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" podStartSLOduration=92.630954902 podStartE2EDuration="1m32.630954902s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.586758568 +0000 UTC m=+156.192711640" watchObservedRunningTime="2026-03-19 20:07:58.630954902 +0000 UTC m=+156.236907974" Mar 19 20:07:58 crc kubenswrapper[4799]: 
I0319 20:07:58.639881 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fgtxw"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.641743 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.649645 4799 ???:1] "http: TLS handshake error from 192.168.126.11:44972: no serving certificate available for the kubelet" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.670675 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.708454 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.713201 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgtxw"] Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.722943 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.723118 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.223101157 +0000 UTC m=+156.829054229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.723222 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7t2\" (UniqueName: \"kubernetes.io/projected/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-kube-api-access-zq7t2\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.723263 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-utilities\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.723282 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-catalog-content\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.723307 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.723784 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.223759652 +0000 UTC m=+156.829712724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.733764 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fkg75" podStartSLOduration=92.733735433 podStartE2EDuration="1m32.733735433s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.730906827 +0000 UTC m=+156.336859899" watchObservedRunningTime="2026-03-19 20:07:58.733735433 +0000 UTC m=+156.339688505" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.734600 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.824552 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.825381 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7t2\" (UniqueName: \"kubernetes.io/projected/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-kube-api-access-zq7t2\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.825459 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-utilities\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.825479 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-catalog-content\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.826051 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.326026821 +0000 UTC m=+156.931979893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.827598 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-utilities\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.827819 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-catalog-content\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.926617 4799 ???:1] "http: TLS handshake error from 192.168.126.11:44976: no serving certificate available for the kubelet" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.927559 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-x5czz" podStartSLOduration=92.927542752 podStartE2EDuration="1m32.927542752s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:58.814282759 +0000 UTC m=+156.420235831" watchObservedRunningTime="2026-03-19 20:07:58.927542752 +0000 UTC m=+156.533495824" Mar 19 20:07:58 
crc kubenswrapper[4799]: I0319 20:07:58.928160 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:58 crc kubenswrapper[4799]: E0319 20:07:58.928852 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.428835782 +0000 UTC m=+157.034788854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.945004 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7t2\" (UniqueName: \"kubernetes.io/projected/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-kube-api-access-zq7t2\") pod \"community-operators-fgtxw\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.965659 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.968518 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:07:58 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:07:58 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:07:58 crc kubenswrapper[4799]: healthz check failed Mar 19 20:07:58 crc kubenswrapper[4799]: I0319 20:07:58.968579 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.014137 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" podStartSLOduration=93.014119798 podStartE2EDuration="1m33.014119798s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.011889576 +0000 UTC m=+156.617842648" watchObservedRunningTime="2026-03-19 20:07:59.014119798 +0000 UTC m=+156.620072870" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.030050 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.030213 4799 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.530179399 +0000 UTC m=+157.136132481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.030283 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.030691 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.530681241 +0000 UTC m=+157.136634313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.124194 4799 ???:1] "http: TLS handshake error from 192.168.126.11:44984: no serving certificate available for the kubelet" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.146101 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.151234 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" podStartSLOduration=93.151213743 podStartE2EDuration="1m33.151213743s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.150013525 +0000 UTC m=+156.755966617" watchObservedRunningTime="2026-03-19 20:07:59.151213743 +0000 UTC m=+156.757166805" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.154206 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:07:59.654174502 +0000 UTC m=+157.260127574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.160256 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwsz" podStartSLOduration=93.160201941 podStartE2EDuration="1m33.160201941s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.107139392 +0000 UTC m=+156.713092464" watchObservedRunningTime="2026-03-19 20:07:59.160201941 +0000 UTC m=+156.766155013" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.161506 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.162295 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.662280089 +0000 UTC m=+157.268233161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.201051 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.267775 4799 ???:1] "http: TLS handshake error from 192.168.126.11:44990: no serving certificate available for the kubelet" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.267908 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" event={"ID":"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7","Type":"ContainerStarted","Data":"e61e5d19cf0ed2d47de82a913d4bc329a74b1f3a45baa352cf7a0762a626b1dc"} Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.291234 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6ddkj" podStartSLOduration=93.291218386 podStartE2EDuration="1m33.291218386s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.268178322 +0000 UTC m=+156.874131394" watchObservedRunningTime="2026-03-19 20:07:59.291218386 +0000 UTC m=+156.897171458" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.292085 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5lgmn" podStartSLOduration=93.292078076 podStartE2EDuration="1m33.292078076s" 
podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.207948107 +0000 UTC m=+156.813901179" watchObservedRunningTime="2026-03-19 20:07:59.292078076 +0000 UTC m=+156.898031148" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.295792 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.317468 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.817416483 +0000 UTC m=+157.423369555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.325288 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9h5w" podStartSLOduration=93.325266365 podStartE2EDuration="1m33.325266365s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.307252597 +0000 UTC m=+156.913205669" watchObservedRunningTime="2026-03-19 20:07:59.325266365 +0000 UTC m=+156.931219477" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.329401 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mtvrz" event={"ID":"0af95971-f650-4f4a-994e-72e58a5ba378","Type":"ContainerStarted","Data":"913812a8e7fb01cb143175905af5e7fd6a766ea852e21791444fd71278ea2252"} Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.410135 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.413702 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-gksq8" event={"ID":"f3ac81e4-55c6-479f-85c7-bc28c56aa3a2","Type":"ContainerStarted","Data":"7b9366309528cd35b4e1888c0cee9d02c665bab8529d394a71a7f888c2c02f2a"} Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.414801 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gksq8" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.415691 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:07:59.915675999 +0000 UTC m=+157.521629071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.448066 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" event={"ID":"38afdf16-38e4-47b5-b3d7-aa040962429d","Type":"ContainerStarted","Data":"8e91c9e312ff671c1097cdcae495ad358c0712f35d700f03a82513c6f2142f6d"} Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.462739 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" gracePeriod=30 Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.476409 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.484724 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgsz2" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.514288 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.514454 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.014427416 +0000 UTC m=+157.620380488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.519237 4799 ???:1] "http: TLS handshake error from 192.168.126.11:45004: no serving certificate available for the kubelet" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.526537 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.530573 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.03055409 +0000 UTC m=+157.636507162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.562782 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2gp4f"] Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.619573 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.619554511 podStartE2EDuration="619.554511ms" podCreationTimestamp="2026-03-19 20:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.588516692 +0000 UTC m=+157.194469764" watchObservedRunningTime="2026-03-19 20:07:59.619554511 +0000 UTC m=+157.225507583" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.628504 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.629075 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.129051451 +0000 UTC m=+157.735004523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.711921 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6mk7s" podStartSLOduration=93.71189487 podStartE2EDuration="1m33.71189487s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.710469567 +0000 UTC m=+157.316422639" watchObservedRunningTime="2026-03-19 20:07:59.71189487 +0000 UTC m=+157.317847942" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.731822 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.732293 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.232276352 +0000 UTC m=+157.838229414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.785021 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" podStartSLOduration=93.784994703 podStartE2EDuration="1m33.784994703s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:07:59.749403899 +0000 UTC m=+157.355356971" watchObservedRunningTime="2026-03-19 20:07:59.784994703 +0000 UTC m=+157.390947775" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.835212 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.835843 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.335821891 +0000 UTC m=+157.941774963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.876480 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lf2lf"] Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.914223 4799 ???:1] "http: TLS handshake error from 192.168.126.11:51554: no serving certificate available for the kubelet" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.933601 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgtxw"] Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.936774 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.936858 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:07:59 crc kubenswrapper[4799]: E0319 20:07:59.938044 4799 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.438014798 +0000 UTC m=+158.043968050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:07:59 crc kubenswrapper[4799]: I0319 20:07:59.966706 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.003640 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:00 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:00 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:00 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.003733 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:00 crc 
kubenswrapper[4799]: I0319 20:08:00.040298 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.042131 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.542087668 +0000 UTC m=+158.148040740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.044659 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.044735 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.044771 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.044819 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.045026 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.045449 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.545433266 +0000 UTC m=+158.151386338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.049914 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.057504 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.060263 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.062568 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.064571 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrsbl" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.068010 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4g9c"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.070057 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.071089 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.081904 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21434d03-6102-44f2-bb92-e1cb4efc47f9-metrics-certs\") pod \"network-metrics-daemon-sndjj\" (UID: \"21434d03-6102-44f2-bb92-e1cb4efc47f9\") " pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.094594 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.123434 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sq5wd"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.138639 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sndjj" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.149592 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4g9c"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.150028 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.150285 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-utilities\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 
19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.150352 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.650325375 +0000 UTC m=+158.256278447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.150429 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.150454 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-catalog-content\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.150496 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb5dx\" (UniqueName: \"kubernetes.io/projected/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-kube-api-access-cb5dx\") 
pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.150763 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.650751424 +0000 UTC m=+158.256704496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.165932 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zxq6v"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.255964 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.256180 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-utilities\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.256250 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-catalog-content\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.256299 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb5dx\" (UniqueName: \"kubernetes.io/projected/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-kube-api-access-cb5dx\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.256702 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.756687078 +0000 UTC m=+158.362640150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.257053 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-utilities\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.257701 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-catalog-content\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.274180 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wnspm" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.277660 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565848-5gcs8"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.278273 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.300760 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.300926 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.300958 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.312893 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb5dx\" (UniqueName: \"kubernetes.io/projected/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-kube-api-access-cb5dx\") pod \"redhat-marketplace-m4g9c\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.322841 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-5gcs8"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.342587 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.349519 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gksq8" podStartSLOduration=9.349502328 podStartE2EDuration="9.349502328s" podCreationTimestamp="2026-03-19 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:00.349181371 +0000 UTC m=+157.955134443" watchObservedRunningTime="2026-03-19 20:08:00.349502328 +0000 UTC m=+157.955455400" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.357492 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.357541 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw98s\" (UniqueName: \"kubernetes.io/projected/31aa7077-55c5-426b-a92f-c93b8d767105-kube-api-access-fw98s\") pod \"auto-csr-approver-29565848-5gcs8\" (UID: \"31aa7077-55c5-426b-a92f-c93b8d767105\") " pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.357888 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.857871182 +0000 UTC m=+158.463824254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.428299 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2587j"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.434236 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.455043 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.459163 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.459465 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw98s\" (UniqueName: \"kubernetes.io/projected/31aa7077-55c5-426b-a92f-c93b8d767105-kube-api-access-fw98s\") pod \"auto-csr-approver-29565848-5gcs8\" (UID: \"31aa7077-55c5-426b-a92f-c93b8d767105\") " pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.460278 4799 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:00.960263164 +0000 UTC m=+158.566216236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.481572 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2587j"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.540431 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw98s\" (UniqueName: \"kubernetes.io/projected/31aa7077-55c5-426b-a92f-c93b8d767105-kube-api-access-fw98s\") pod \"auto-csr-approver-29565848-5gcs8\" (UID: \"31aa7077-55c5-426b-a92f-c93b8d767105\") " pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.561898 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerStarted","Data":"23ed53307635e45e4bd4ce99a1064af92130379499e66d59bec054dd86acb90f"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.561942 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerStarted","Data":"190353479dd68dc5e8d42761ffede034a1ae51f50e6154cfc35cea929e55a255"} Mar 19 20:08:00 crc 
kubenswrapper[4799]: I0319 20:08:00.563411 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-catalog-content\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.563441 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.563460 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-utilities\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.563477 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ks6\" (UniqueName: \"kubernetes.io/projected/af016f47-a261-463b-98e4-710fdb4557bb-kube-api-access-97ks6\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.563778 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 20:08:01.063764581 +0000 UTC m=+158.669717653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.589400 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq5wd" event={"ID":"9365ab34-43df-42ee-bc04-baeba5579717","Type":"ContainerStarted","Data":"25516d56582951a4797b5957b0746f1e1d95111f53fa8d8231b25a848973291a"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.649696 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.661890 4799 generic.go:334] "Generic (PLEG): container finished" podID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerID="0af840e72eff2e61f1380262ed655205ee94d6cba83fd43e9c8dbe2b1b6234ac" exitCode=0 Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.661964 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gp4f" event={"ID":"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40","Type":"ContainerDied","Data":"0af840e72eff2e61f1380262ed655205ee94d6cba83fd43e9c8dbe2b1b6234ac"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.661996 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gp4f" event={"ID":"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40","Type":"ContainerStarted","Data":"645fbaf0afe66d5d1c6c4939df667c813994072ef6a7d9530571b233a15ea2d4"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.664083 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.664243 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-catalog-content\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.664278 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-utilities\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.664298 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ks6\" (UniqueName: \"kubernetes.io/projected/af016f47-a261-463b-98e4-710fdb4557bb-kube-api-access-97ks6\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.664848 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-catalog-content\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.665477 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-utilities\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.665529 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:01.165512658 +0000 UTC m=+158.771465730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.686315 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ks6\" (UniqueName: \"kubernetes.io/projected/af016f47-a261-463b-98e4-710fdb4557bb-kube-api-access-97ks6\") pod \"redhat-marketplace-2587j\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.696250 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" event={"ID":"5a75f44b-8b2c-4bdd-bc33-ccf784a9e7b7","Type":"ContainerStarted","Data":"337e552d1a26c62e1c060465914058909bbaf0c2cbe327c7fe25808cc2bcc7b0"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.703374 4799 ???:1] "http: TLS handshake error from 192.168.126.11:51562: no serving certificate available for the kubelet" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.711510 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.736964 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" event={"ID":"38afdf16-38e4-47b5-b3d7-aa040962429d","Type":"ContainerStarted","Data":"ee9bbe98098a1aa831989a1fb0e58c90b5e9479efe07f230d8e0b6fdf8fcbb73"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.745887 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerID="fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29" exitCode=0 Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.746737 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerDied","Data":"fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.746759 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerStarted","Data":"fa157baccf7919a8da3e176ae4b43a5994f28542a2387ed33ec57050a326defb"} Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.748074 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" containerName="controller-manager" containerID="cri-o://851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72" gracePeriod=30 Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.766416 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.778377 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:01.278362572 +0000 UTC m=+158.884315644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.845842 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sndjj"] Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.869427 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.879325 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 20:08:01.37929315 +0000 UTC m=+158.985246232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.898491 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.966361 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:00 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:00 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:00 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.966493 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.972709 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:00 crc kubenswrapper[4799]: E0319 20:08:00.973035 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:01.473022111 +0000 UTC m=+159.078975183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:00 crc kubenswrapper[4799]: I0319 20:08:00.974548 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" podStartSLOduration=94.974529266 podStartE2EDuration="1m34.974529266s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:00.970559924 +0000 UTC m=+158.576512996" watchObservedRunningTime="2026-03-19 20:08:00.974529266 +0000 UTC m=+158.580482338" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.074199 4799 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.075232 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:01 crc kubenswrapper[4799]: E0319 20:08:01.075657 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:08:01.575637228 +0000 UTC m=+159.181590310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.177327 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k2dks"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.178613 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.179324 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.185850 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2dks"] Mar 19 20:08:01 crc kubenswrapper[4799]: E0319 20:08:01.186296 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 20:08:01.68627413 +0000 UTC m=+159.292227192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nbgk7" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.195502 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.224539 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4g9c"] Mar 19 20:08:01 crc kubenswrapper[4799]: W0319 20:08:01.225608 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-57606adc1910a6de7fe0cbfb4e2ace15806e47620f91453d3ae1c0c8df4ba509 WatchSource:0}: Error finding container 57606adc1910a6de7fe0cbfb4e2ace15806e47620f91453d3ae1c0c8df4ba509: Status 404 returned error can't find the container with id 57606adc1910a6de7fe0cbfb4e2ace15806e47620f91453d3ae1c0c8df4ba509 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.247804 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-5gcs8"] Mar 19 20:08:01 crc kubenswrapper[4799]: W0319 20:08:01.271020 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31aa7077_55c5_426b_a92f_c93b8d767105.slice/crio-a5c2b77e039b2bf67c04609cec251c960d92570f2e04adfd7e3c76452fd9f1e2 WatchSource:0}: Error finding container a5c2b77e039b2bf67c04609cec251c960d92570f2e04adfd7e3c76452fd9f1e2: Status 404 returned error can't find the container 
with id a5c2b77e039b2bf67c04609cec251c960d92570f2e04adfd7e3c76452fd9f1e2 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.281573 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.281805 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-utilities\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.281857 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-catalog-content\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.281888 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5npwg\" (UniqueName: \"kubernetes.io/projected/3b719eba-9287-4b38-9749-f5c2e09e32e4-kube-api-access-5npwg\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: E0319 20:08:01.281984 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 20:08:01.781967157 +0000 UTC m=+159.387920229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 20:08:01 crc kubenswrapper[4799]: W0319 20:08:01.288325 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58b5c1c_2c25_4bb0_b457_7c30bc24ce4e.slice/crio-a4bc170b685aa67ff2b4435f94a9a0fe555087949ce845c0c24d65a6afa4d10f WatchSource:0}: Error finding container a4bc170b685aa67ff2b4435f94a9a0fe555087949ce845c0c24d65a6afa4d10f: Status 404 returned error can't find the container with id a4bc170b685aa67ff2b4435f94a9a0fe555087949ce845c0c24d65a6afa4d10f Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.307890 4799 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T20:08:01.074617194Z","Handler":null,"Name":""} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.336600 4799 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.336633 4799 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.382975 4799 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.384663 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5npwg\" (UniqueName: \"kubernetes.io/projected/3b719eba-9287-4b38-9749-f5c2e09e32e4-kube-api-access-5npwg\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.384711 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.384743 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-utilities\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.384797 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-catalog-content\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.385166 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-catalog-content\") pod 
\"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.385830 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-utilities\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.413063 4799 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.413100 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.428535 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5npwg\" (UniqueName: \"kubernetes.io/projected/3b719eba-9287-4b38-9749-f5c2e09e32e4-kube-api-access-5npwg\") pod \"redhat-operators-k2dks\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.501734 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nbgk7\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.504353 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2587j"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.546480 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.546921 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.549637 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q5qgk"] Mar 19 20:08:01 crc kubenswrapper[4799]: E0319 20:08:01.549809 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" containerName="controller-manager" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.549822 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" containerName="controller-manager" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.549922 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" containerName="controller-manager" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.550585 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.556103 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5qgk"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589107 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589164 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtxr\" (UniqueName: \"kubernetes.io/projected/fcc1241a-6888-4bcc-b784-474bc8865f63-kube-api-access-gjtxr\") pod \"fcc1241a-6888-4bcc-b784-474bc8865f63\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589241 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-config\") pod \"fcc1241a-6888-4bcc-b784-474bc8865f63\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589276 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc1241a-6888-4bcc-b784-474bc8865f63-serving-cert\") pod \"fcc1241a-6888-4bcc-b784-474bc8865f63\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589364 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-client-ca\") pod \"fcc1241a-6888-4bcc-b784-474bc8865f63\" (UID: 
\"fcc1241a-6888-4bcc-b784-474bc8865f63\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589431 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-proxy-ca-bundles\") pod \"fcc1241a-6888-4bcc-b784-474bc8865f63\" (UID: \"fcc1241a-6888-4bcc-b784-474bc8865f63\") " Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589687 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqml\" (UniqueName: \"kubernetes.io/projected/9837fdbc-dcc3-4ba4-a168-a5ec29496871-kube-api-access-wdqml\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589737 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-catalog-content\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.589798 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-utilities\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.591190 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fcc1241a-6888-4bcc-b784-474bc8865f63" (UID: "fcc1241a-6888-4bcc-b784-474bc8865f63"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.593167 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-config" (OuterVolumeSpecName: "config") pod "fcc1241a-6888-4bcc-b784-474bc8865f63" (UID: "fcc1241a-6888-4bcc-b784-474bc8865f63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.600074 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-client-ca" (OuterVolumeSpecName: "client-ca") pod "fcc1241a-6888-4bcc-b784-474bc8865f63" (UID: "fcc1241a-6888-4bcc-b784-474bc8865f63"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.611101 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc1241a-6888-4bcc-b784-474bc8865f63-kube-api-access-gjtxr" (OuterVolumeSpecName: "kube-api-access-gjtxr") pod "fcc1241a-6888-4bcc-b784-474bc8865f63" (UID: "fcc1241a-6888-4bcc-b784-474bc8865f63"). InnerVolumeSpecName "kube-api-access-gjtxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.611172 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc1241a-6888-4bcc-b784-474bc8865f63-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fcc1241a-6888-4bcc-b784-474bc8865f63" (UID: "fcc1241a-6888-4bcc-b784-474bc8865f63"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.624356 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.646559 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692081 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-catalog-content\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692155 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-utilities\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692188 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdqml\" (UniqueName: \"kubernetes.io/projected/9837fdbc-dcc3-4ba4-a168-a5ec29496871-kube-api-access-wdqml\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692261 4799 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692276 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcc1241a-6888-4bcc-b784-474bc8865f63-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692284 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692295 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcc1241a-6888-4bcc-b784-474bc8865f63-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692305 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtxr\" (UniqueName: \"kubernetes.io/projected/fcc1241a-6888-4bcc-b784-474bc8865f63-kube-api-access-gjtxr\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.692887 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-catalog-content\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.695830 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-utilities\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " 
pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.712282 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqml\" (UniqueName: \"kubernetes.io/projected/9837fdbc-dcc3-4ba4-a168-a5ec29496871-kube-api-access-wdqml\") pod \"redhat-operators-q5qgk\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.789747 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b5d163b1f7dacdbb38c8ee1f8cb08455160bb96e79a8bd74dbe5eea6e9b7a1f9"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.790083 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"57606adc1910a6de7fe0cbfb4e2ace15806e47620f91453d3ae1c0c8df4ba509"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.808422 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" event={"ID":"31aa7077-55c5-426b-a92f-c93b8d767105","Type":"ContainerStarted","Data":"a5c2b77e039b2bf67c04609cec251c960d92570f2e04adfd7e3c76452fd9f1e2"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.810980 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sndjj" event={"ID":"21434d03-6102-44f2-bb92-e1cb4efc47f9","Type":"ContainerStarted","Data":"c1ca36cc53e24dc4b117526396e9b38d1c5b7002750ff3d83e9f393d95e6f835"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.811014 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sndjj" 
event={"ID":"21434d03-6102-44f2-bb92-e1cb4efc47f9","Type":"ContainerStarted","Data":"b94f0f32bf048dd31b35fb0127726aa812d2408fc5132e1f75aba9e64afecdc5"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.812362 4799 generic.go:334] "Generic (PLEG): container finished" podID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerID="f2770ee94941c7eafee41b4b2c52c16dfb2ae9655036fff1828435ad8305240d" exitCode=0 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.812435 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerDied","Data":"f2770ee94941c7eafee41b4b2c52c16dfb2ae9655036fff1828435ad8305240d"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.812456 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerStarted","Data":"a4bc170b685aa67ff2b4435f94a9a0fe555087949ce845c0c24d65a6afa4d10f"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.837764 4799 generic.go:334] "Generic (PLEG): container finished" podID="fcc1241a-6888-4bcc-b784-474bc8865f63" containerID="851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72" exitCode=0 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.837847 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" event={"ID":"fcc1241a-6888-4bcc-b784-474bc8865f63","Type":"ContainerDied","Data":"851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.837891 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" event={"ID":"fcc1241a-6888-4bcc-b784-474bc8865f63","Type":"ContainerDied","Data":"6c983e470723989c2083ce3148a56465520f16fba7a8d7c309a4f4e03a952d3c"} Mar 19 20:08:01 
crc kubenswrapper[4799]: I0319 20:08:01.837907 4799 scope.go:117] "RemoveContainer" containerID="851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.838053 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zxq6v" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.850478 4799 generic.go:334] "Generic (PLEG): container finished" podID="9365ab34-43df-42ee-bc04-baeba5579717" containerID="7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4" exitCode=0 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.850652 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq5wd" event={"ID":"9365ab34-43df-42ee-bc04-baeba5579717","Type":"ContainerDied","Data":"7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.874274 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"190237cd453c1db0d6da172c6169e56f480aaac218139813fbfbebba99efb8c1"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.874347 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"507af5eb3e199b0cd1935150c66e364e57fafa9187ecafd4d6eec5613f968bda"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.874926 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.884315 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.886129 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zxq6v"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.893112 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2587j" event={"ID":"af016f47-a261-463b-98e4-710fdb4557bb","Type":"ContainerStarted","Data":"b5186b2ecd3795166c9ed949a997550c688cc125e71aa8ce53be0a41fd01837f"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.896092 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zxq6v"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.909159 4799 generic.go:334] "Generic (PLEG): container finished" podID="f8720bb6-e6ea-43b3-a750-d0b7c1221266" containerID="b9e9a56806475d735ebce84ef1fa1d7783c9dcff70f00bcfab9b6d2c920fe173" exitCode=0 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.909251 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" event={"ID":"f8720bb6-e6ea-43b3-a750-d0b7c1221266","Type":"ContainerDied","Data":"b9e9a56806475d735ebce84ef1fa1d7783c9dcff70f00bcfab9b6d2c920fe173"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.921620 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" event={"ID":"38afdf16-38e4-47b5-b3d7-aa040962429d","Type":"ContainerStarted","Data":"27801031e7e997fdee2b82a46906545daf64b02141eb3a325ca2070cefb1fc4d"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.925567 4799 generic.go:334] "Generic (PLEG): container finished" podID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerID="23ed53307635e45e4bd4ce99a1064af92130379499e66d59bec054dd86acb90f" exitCode=0 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 
20:08:01.925651 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerDied","Data":"23ed53307635e45e4bd4ce99a1064af92130379499e66d59bec054dd86acb90f"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.930497 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"abb840f7ab528595dead9db375032985f7c707b9f0e7902e73beeb714dfa0c95"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.930566 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f82c2dbc76843c2a4691ae1943cff419f7b51270f09423afaa60dc3c2ceb8d36"} Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.931944 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" podUID="75926aed-864d-42ee-aabf-89e5579606a7" containerName="route-controller-manager" containerID="cri-o://28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3" gracePeriod=30 Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.947215 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.948284 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.950580 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.954824 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.955041 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.970336 4799 scope.go:117] "RemoveContainer" containerID="851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.970546 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:01 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:01 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:01 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.970589 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:01 crc kubenswrapper[4799]: E0319 20:08:01.972202 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72\": container with ID starting with 851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72 not found: ID 
does not exist" containerID="851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.972232 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72"} err="failed to get container status \"851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72\": rpc error: code = NotFound desc = could not find container \"851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72\": container with ID starting with 851e2d7728c48303e1ba14bfef23b5fdd93c93eea7aac7f51e296b601ed08a72 not found: ID does not exist" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.995590 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e712659-18f0-47eb-ba81-7b09b45677b5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:01 crc kubenswrapper[4799]: I0319 20:08:01.996140 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e712659-18f0-47eb-ba81-7b09b45677b5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.060108 4799 ???:1] "http: TLS handshake error from 192.168.126.11:51578: no serving certificate available for the kubelet" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.083374 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2dks"] Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.109886 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e712659-18f0-47eb-ba81-7b09b45677b5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.109954 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e712659-18f0-47eb-ba81-7b09b45677b5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.110432 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e712659-18f0-47eb-ba81-7b09b45677b5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.140354 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e712659-18f0-47eb-ba81-7b09b45677b5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.283423 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.404938 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.445081 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbgk7"] Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.521949 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lds6j\" (UniqueName: \"kubernetes.io/projected/75926aed-864d-42ee-aabf-89e5579606a7-kube-api-access-lds6j\") pod \"75926aed-864d-42ee-aabf-89e5579606a7\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.522410 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-config\") pod \"75926aed-864d-42ee-aabf-89e5579606a7\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.522548 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-client-ca\") pod \"75926aed-864d-42ee-aabf-89e5579606a7\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.522578 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75926aed-864d-42ee-aabf-89e5579606a7-serving-cert\") pod \"75926aed-864d-42ee-aabf-89e5579606a7\" (UID: \"75926aed-864d-42ee-aabf-89e5579606a7\") " Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.524984 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "75926aed-864d-42ee-aabf-89e5579606a7" (UID: 
"75926aed-864d-42ee-aabf-89e5579606a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.525237 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-config" (OuterVolumeSpecName: "config") pod "75926aed-864d-42ee-aabf-89e5579606a7" (UID: "75926aed-864d-42ee-aabf-89e5579606a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.533414 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75926aed-864d-42ee-aabf-89e5579606a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "75926aed-864d-42ee-aabf-89e5579606a7" (UID: "75926aed-864d-42ee-aabf-89e5579606a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.534122 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75926aed-864d-42ee-aabf-89e5579606a7-kube-api-access-lds6j" (OuterVolumeSpecName: "kube-api-access-lds6j") pod "75926aed-864d-42ee-aabf-89e5579606a7" (UID: "75926aed-864d-42ee-aabf-89e5579606a7"). InnerVolumeSpecName "kube-api-access-lds6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.555146 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 20:08:02 crc kubenswrapper[4799]: E0319 20:08:02.555478 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75926aed-864d-42ee-aabf-89e5579606a7" containerName="route-controller-manager" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.555500 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="75926aed-864d-42ee-aabf-89e5579606a7" containerName="route-controller-manager" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.555619 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="75926aed-864d-42ee-aabf-89e5579606a7" containerName="route-controller-manager" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.556013 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.556064 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.560767 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.561263 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.624458 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/890e0f2b-118e-4589-8b8a-9901bf32e00f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.624580 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/890e0f2b-118e-4589-8b8a-9901bf32e00f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.624632 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lds6j\" (UniqueName: \"kubernetes.io/projected/75926aed-864d-42ee-aabf-89e5579606a7-kube-api-access-lds6j\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.624646 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.624713 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75926aed-864d-42ee-aabf-89e5579606a7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.624725 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75926aed-864d-42ee-aabf-89e5579606a7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.627118 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5qgk"] Mar 19 20:08:02 crc kubenswrapper[4799]: W0319 20:08:02.692689 4799 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9837fdbc_dcc3_4ba4_a168_a5ec29496871.slice/crio-bfc90906ba0d9b38c6b63c04dfb37972918c55fd6fff4346fc0c8315b26f0694 WatchSource:0}: Error finding container bfc90906ba0d9b38c6b63c04dfb37972918c55fd6fff4346fc0c8315b26f0694: Status 404 returned error can't find the container with id bfc90906ba0d9b38c6b63c04dfb37972918c55fd6fff4346fc0c8315b26f0694 Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.726040 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/890e0f2b-118e-4589-8b8a-9901bf32e00f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.726083 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/890e0f2b-118e-4589-8b8a-9901bf32e00f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.726129 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/890e0f2b-118e-4589-8b8a-9901bf32e00f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.744275 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.766953 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/890e0f2b-118e-4589-8b8a-9901bf32e00f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.962949 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.968405 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:02 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:02 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:02 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.968478 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.994931 4799 generic.go:334] "Generic (PLEG): container finished" podID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerID="b770c88b279e1812d107440ff3cda4f6a9de11ed9bb8d61247a2d2ef0ec9ab50" exitCode=0 Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.995011 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2dks" event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerDied","Data":"b770c88b279e1812d107440ff3cda4f6a9de11ed9bb8d61247a2d2ef0ec9ab50"} Mar 19 20:08:02 crc kubenswrapper[4799]: I0319 20:08:02.995075 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2dks" 
event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerStarted","Data":"706ec47fbc9175166d46dab2382c0256e641f7d9aa4ca1420fca117556119fd2"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.003485 4799 generic.go:334] "Generic (PLEG): container finished" podID="af016f47-a261-463b-98e4-710fdb4557bb" containerID="2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8" exitCode=0 Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.003590 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2587j" event={"ID":"af016f47-a261-463b-98e4-710fdb4557bb","Type":"ContainerDied","Data":"2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.013065 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" event={"ID":"f4f4875b-2470-43e2-aa8a-ae48871e0ca9","Type":"ContainerStarted","Data":"9b01aa447794753c58cfbc9b416eb1515d077b6a64a45d0138fb1d1bc898278c"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.013118 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" event={"ID":"f4f4875b-2470-43e2-aa8a-ae48871e0ca9","Type":"ContainerStarted","Data":"9a3d95bc26d055e2de59859bf5ada1f33ba76bd8fe28d082364c8a8b4aa4a5d9"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.014337 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.030366 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerStarted","Data":"bfc90906ba0d9b38c6b63c04dfb37972918c55fd6fff4346fc0c8315b26f0694"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.063181 4799 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7e712659-18f0-47eb-ba81-7b09b45677b5","Type":"ContainerStarted","Data":"0283f6a8295a90cedbde5b02826405b0039b3ade1db979d18033e469afe876c6"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.153522 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" podStartSLOduration=97.153502709 podStartE2EDuration="1m37.153502709s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:03.045100478 +0000 UTC m=+160.651053560" watchObservedRunningTime="2026-03-19 20:08:03.153502709 +0000 UTC m=+160.759455781" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.198358 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.199217 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc1241a-6888-4bcc-b784-474bc8865f63" path="/var/lib/kubelet/pods/fcc1241a-6888-4bcc-b784-474bc8865f63/volumes" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.199913 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" event={"ID":"38afdf16-38e4-47b5-b3d7-aa040962429d","Type":"ContainerStarted","Data":"766daa78882a93ac07f229f60e1f471ee2cfb8876f894388623f1749aaceb904"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.203115 4799 generic.go:334] "Generic (PLEG): container finished" podID="75926aed-864d-42ee-aabf-89e5579606a7" containerID="28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3" exitCode=0 Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.203169 4799 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" event={"ID":"75926aed-864d-42ee-aabf-89e5579606a7","Type":"ContainerDied","Data":"28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.203198 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" event={"ID":"75926aed-864d-42ee-aabf-89e5579606a7","Type":"ContainerDied","Data":"b0d9dcebf6dec16613056a2a655a4cc33df7f6438b9deea6689980bbb526bd9a"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.203217 4799 scope.go:117] "RemoveContainer" containerID="28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.203318 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.238329 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sndjj" event={"ID":"21434d03-6102-44f2-bb92-e1cb4efc47f9","Type":"ContainerStarted","Data":"f6cae844cdee344e6f14c6c2d15e64c7009eb0ad01c5f140726ac94e5e0b40a7"} Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.337436 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sndjj" podStartSLOduration=97.337412909 podStartE2EDuration="1m37.337412909s" podCreationTimestamp="2026-03-19 20:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:03.31758526 +0000 UTC m=+160.923538332" watchObservedRunningTime="2026-03-19 20:08:03.337412909 +0000 UTC m=+160.943365981" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.355731 4799 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ttwrh" podStartSLOduration=12.355718083 podStartE2EDuration="12.355718083s" podCreationTimestamp="2026-03-19 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:03.355241232 +0000 UTC m=+160.961194304" watchObservedRunningTime="2026-03-19 20:08:03.355718083 +0000 UTC m=+160.961671155" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.390776 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.392549 4799 scope.go:117] "RemoveContainer" containerID="28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3" Mar 19 20:08:03 crc kubenswrapper[4799]: E0319 20:08:03.395794 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3\": container with ID starting with 28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3 not found: ID does not exist" containerID="28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.395826 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3"} err="failed to get container status \"28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3\": rpc error: code = NotFound desc = could not find container \"28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3\": container with ID starting with 28f06d09388f4f44c2d169c6c46c8d09e953ab001460f328899206ae94abd7e3 not found: ID does not exist" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 
20:08:03.406292 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nzcwj"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.420487 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.421257 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.430092 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77dcb595b7-jv6pv"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.430786 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.438593 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439343 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439594 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c96b18-7050-4f09-b776-93b6a5c62eed-serving-cert\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439622 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8wf\" (UniqueName: 
\"kubernetes.io/projected/f2c96b18-7050-4f09-b776-93b6a5c62eed-kube-api-access-7p8wf\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439643 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-config\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439659 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-client-ca\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439682 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-proxy-ca-bundles\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439696 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2cnm\" (UniqueName: \"kubernetes.io/projected/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-kube-api-access-w2cnm\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439712 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-client-ca\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439753 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-config\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.439778 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-serving-cert\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.440060 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.440093 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.440182 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.440304 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.441009 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.441122 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.441226 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.441337 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.441481 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.441607 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.448410 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.457910 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.475970 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dcb595b7-jv6pv"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.548999 
4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8wf\" (UniqueName: \"kubernetes.io/projected/f2c96b18-7050-4f09-b776-93b6a5c62eed-kube-api-access-7p8wf\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549564 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-config\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549582 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-client-ca\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549607 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-proxy-ca-bundles\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549622 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2cnm\" (UniqueName: \"kubernetes.io/projected/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-kube-api-access-w2cnm\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549637 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-client-ca\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549674 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-config\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549700 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-serving-cert\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.549728 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c96b18-7050-4f09-b776-93b6a5c62eed-serving-cert\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.551249 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-config\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.551807 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-client-ca\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.552335 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-client-ca\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.552802 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-config\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.553026 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-proxy-ca-bundles\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.562193 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-serving-cert\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.565044 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c96b18-7050-4f09-b776-93b6a5c62eed-serving-cert\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.571961 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2cnm\" (UniqueName: \"kubernetes.io/projected/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-kube-api-access-w2cnm\") pod \"route-controller-manager-7f97766657-rk9j5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.585844 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8wf\" (UniqueName: \"kubernetes.io/projected/f2c96b18-7050-4f09-b776-93b6a5c62eed-kube-api-access-7p8wf\") pod \"controller-manager-77dcb595b7-jv6pv\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.638223 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.750720 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.751900 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8720bb6-e6ea-43b3-a750-d0b7c1221266-secret-volume\") pod \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.751983 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8720bb6-e6ea-43b3-a750-d0b7c1221266-config-volume\") pod \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.752125 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5lcf\" (UniqueName: \"kubernetes.io/projected/f8720bb6-e6ea-43b3-a750-d0b7c1221266-kube-api-access-c5lcf\") pod \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\" (UID: \"f8720bb6-e6ea-43b3-a750-d0b7c1221266\") " Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.753100 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8720bb6-e6ea-43b3-a750-d0b7c1221266-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8720bb6-e6ea-43b3-a750-d0b7c1221266" (UID: "f8720bb6-e6ea-43b3-a750-d0b7c1221266"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.754998 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8720bb6-e6ea-43b3-a750-d0b7c1221266-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f8720bb6-e6ea-43b3-a750-d0b7c1221266" (UID: "f8720bb6-e6ea-43b3-a750-d0b7c1221266"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.755892 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8720bb6-e6ea-43b3-a750-d0b7c1221266-kube-api-access-c5lcf" (OuterVolumeSpecName: "kube-api-access-c5lcf") pod "f8720bb6-e6ea-43b3-a750-d0b7c1221266" (UID: "f8720bb6-e6ea-43b3-a750-d0b7c1221266"). InnerVolumeSpecName "kube-api-access-c5lcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.760930 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.832402 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.854301 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5lcf\" (UniqueName: \"kubernetes.io/projected/f8720bb6-e6ea-43b3-a750-d0b7c1221266-kube-api-access-c5lcf\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.854332 4799 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8720bb6-e6ea-43b3-a750-d0b7c1221266-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.854347 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8720bb6-e6ea-43b3-a750-d0b7c1221266-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:03 crc kubenswrapper[4799]: W0319 20:08:03.876823 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod890e0f2b_118e_4589_8b8a_9901bf32e00f.slice/crio-ebc7a0bc21937a2e55b17d112ddcdd49e42f19be77d13d2e709be9f86d02abcc 
WatchSource:0}: Error finding container ebc7a0bc21937a2e55b17d112ddcdd49e42f19be77d13d2e709be9f86d02abcc: Status 404 returned error can't find the container with id ebc7a0bc21937a2e55b17d112ddcdd49e42f19be77d13d2e709be9f86d02abcc Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.919904 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.919939 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.921954 4799 patch_prober.go:28] interesting pod/console-f9d7485db-lq2lw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.922038 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lq2lw" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.963215 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.968035 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:03 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:03 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:03 crc kubenswrapper[4799]: healthz check failed Mar 19 
20:08:03 crc kubenswrapper[4799]: I0319 20:08:03.968081 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.227806 4799 patch_prober.go:28] interesting pod/downloads-7954f5f757-49fwh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.228103 4799 patch_prober.go:28] interesting pod/downloads-7954f5f757-49fwh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.228148 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-49fwh" podUID="e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.228207 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-49fwh" podUID="e1aa0e72-b1d4-4e90-9a9c-9283c47d7a9d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.266920 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"890e0f2b-118e-4589-8b8a-9901bf32e00f","Type":"ContainerStarted","Data":"ebc7a0bc21937a2e55b17d112ddcdd49e42f19be77d13d2e709be9f86d02abcc"} Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.269825 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" event={"ID":"f8720bb6-e6ea-43b3-a750-d0b7c1221266","Type":"ContainerDied","Data":"0e8e2f54ab2b3db9a69b7da0dd50c59699a03c28543c57776c4663f630dca0ff"} Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.269932 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8e2f54ab2b3db9a69b7da0dd50c59699a03c28543c57776c4663f630dca0ff" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.269877 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.275621 4799 generic.go:334] "Generic (PLEG): container finished" podID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerID="ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083" exitCode=0 Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.275694 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerDied","Data":"ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083"} Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.282746 4799 generic.go:334] "Generic (PLEG): container finished" podID="7e712659-18f0-47eb-ba81-7b09b45677b5" containerID="10003ef4c4bfa69f7967c83d35a17f4dab979a66ff5875d08e67192fc2e064db" exitCode=0 Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.282864 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"7e712659-18f0-47eb-ba81-7b09b45677b5","Type":"ContainerDied","Data":"10003ef4c4bfa69f7967c83d35a17f4dab979a66ff5875d08e67192fc2e064db"} Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.342761 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dcb595b7-jv6pv"] Mar 19 20:08:04 crc kubenswrapper[4799]: W0319 20:08:04.358070 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c96b18_7050_4f09_b776_93b6a5c62eed.slice/crio-477bddeb4f0426fb0f2f811d8f9315384958080e2c32230c40586fd96c44f602 WatchSource:0}: Error finding container 477bddeb4f0426fb0f2f811d8f9315384958080e2c32230c40586fd96c44f602: Status 404 returned error can't find the container with id 477bddeb4f0426fb0f2f811d8f9315384958080e2c32230c40586fd96c44f602 Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.435176 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5"] Mar 19 20:08:04 crc kubenswrapper[4799]: E0319 20:08:04.444053 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:04 crc kubenswrapper[4799]: E0319 20:08:04.446927 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:04 crc kubenswrapper[4799]: W0319 20:08:04.456408 4799 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3d929b_cf45_4713_8ecc_cabc98bb6dc5.slice/crio-f45439a99abc1ef5bb98875b14022c24a602e9196c2495f5012fd8b5157d4501 WatchSource:0}: Error finding container f45439a99abc1ef5bb98875b14022c24a602e9196c2495f5012fd8b5157d4501: Status 404 returned error can't find the container with id f45439a99abc1ef5bb98875b14022c24a602e9196c2495f5012fd8b5157d4501 Mar 19 20:08:04 crc kubenswrapper[4799]: E0319 20:08:04.469975 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:04 crc kubenswrapper[4799]: E0319 20:08:04.470028 4799 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.645134 4799 ???:1] "http: TLS handshake error from 192.168.126.11:51590: no serving certificate available for the kubelet" Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.973923 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:04 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:04 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:04 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:04 crc kubenswrapper[4799]: I0319 20:08:04.974033 4799 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.118068 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:08:05 crc kubenswrapper[4799]: E0319 20:08:05.118563 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.142174 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75926aed-864d-42ee-aabf-89e5579606a7" path="/var/lib/kubelet/pods/75926aed-864d-42ee-aabf-89e5579606a7/volumes" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.301030 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" event={"ID":"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5","Type":"ContainerStarted","Data":"b94ddb2e4a9adb827c1417ee8f4401d99b961755bdaa4693c665207d1906c864"} Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.301091 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" event={"ID":"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5","Type":"ContainerStarted","Data":"f45439a99abc1ef5bb98875b14022c24a602e9196c2495f5012fd8b5157d4501"} Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.302404 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.306794 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"890e0f2b-118e-4589-8b8a-9901bf32e00f","Type":"ContainerStarted","Data":"92cc8dc97b75594b4acb84c7bcbe1b47ecbbf2a572874eafb34d8b827a33cc2c"} Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.326769 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" event={"ID":"f2c96b18-7050-4f09-b776-93b6a5c62eed","Type":"ContainerStarted","Data":"b67fd6bafebeb20c71b0fa86e2cc11cdb90d196c5176ec474ed4b23c08ed51b2"} Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.326814 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" event={"ID":"f2c96b18-7050-4f09-b776-93b6a5c62eed","Type":"ContainerStarted","Data":"477bddeb4f0426fb0f2f811d8f9315384958080e2c32230c40586fd96c44f602"} Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.327086 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.342837 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.362849 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.362810333 podStartE2EDuration="3.362810333s" podCreationTimestamp="2026-03-19 20:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:05.359327702 +0000 UTC 
m=+162.965280774" watchObservedRunningTime="2026-03-19 20:08:05.362810333 +0000 UTC m=+162.968763405" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.364936 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" podStartSLOduration=4.364925382 podStartE2EDuration="4.364925382s" podCreationTimestamp="2026-03-19 20:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:05.340464435 +0000 UTC m=+162.946417507" watchObservedRunningTime="2026-03-19 20:08:05.364925382 +0000 UTC m=+162.970878454" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.387306 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" podStartSLOduration=4.38728981 podStartE2EDuration="4.38728981s" podCreationTimestamp="2026-03-19 20:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:05.383987603 +0000 UTC m=+162.989940675" watchObservedRunningTime="2026-03-19 20:08:05.38728981 +0000 UTC m=+162.993242882" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.496518 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.509034 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.528616 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.609548 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.967190 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:05 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:05 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:05 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.967348 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:05 crc kubenswrapper[4799]: I0319 20:08:05.996072 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.091422 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e712659-18f0-47eb-ba81-7b09b45677b5-kube-api-access\") pod \"7e712659-18f0-47eb-ba81-7b09b45677b5\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.091492 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e712659-18f0-47eb-ba81-7b09b45677b5-kubelet-dir\") pod \"7e712659-18f0-47eb-ba81-7b09b45677b5\" (UID: \"7e712659-18f0-47eb-ba81-7b09b45677b5\") " Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.092049 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e712659-18f0-47eb-ba81-7b09b45677b5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7e712659-18f0-47eb-ba81-7b09b45677b5" (UID: "7e712659-18f0-47eb-ba81-7b09b45677b5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.102597 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e712659-18f0-47eb-ba81-7b09b45677b5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7e712659-18f0-47eb-ba81-7b09b45677b5" (UID: "7e712659-18f0-47eb-ba81-7b09b45677b5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.193752 4799 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e712659-18f0-47eb-ba81-7b09b45677b5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.193787 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e712659-18f0-47eb-ba81-7b09b45677b5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.345193 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.346420 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7e712659-18f0-47eb-ba81-7b09b45677b5","Type":"ContainerDied","Data":"0283f6a8295a90cedbde5b02826405b0039b3ade1db979d18033e469afe876c6"} Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.346458 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0283f6a8295a90cedbde5b02826405b0039b3ade1db979d18033e469afe876c6" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.351188 4799 generic.go:334] "Generic (PLEG): container finished" podID="890e0f2b-118e-4589-8b8a-9901bf32e00f" containerID="92cc8dc97b75594b4acb84c7bcbe1b47ecbbf2a572874eafb34d8b827a33cc2c" exitCode=0 Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.352295 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"890e0f2b-118e-4589-8b8a-9901bf32e00f","Type":"ContainerDied","Data":"92cc8dc97b75594b4acb84c7bcbe1b47ecbbf2a572874eafb34d8b827a33cc2c"} Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.356568 4799 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nthhp" Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.965181 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:06 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:06 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:06 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:06 crc kubenswrapper[4799]: I0319 20:08:06.965243 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.361560 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gksq8" Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.802367 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.929610 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/890e0f2b-118e-4589-8b8a-9901bf32e00f-kube-api-access\") pod \"890e0f2b-118e-4589-8b8a-9901bf32e00f\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.930120 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/890e0f2b-118e-4589-8b8a-9901bf32e00f-kubelet-dir\") pod \"890e0f2b-118e-4589-8b8a-9901bf32e00f\" (UID: \"890e0f2b-118e-4589-8b8a-9901bf32e00f\") " Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.930197 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/890e0f2b-118e-4589-8b8a-9901bf32e00f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "890e0f2b-118e-4589-8b8a-9901bf32e00f" (UID: "890e0f2b-118e-4589-8b8a-9901bf32e00f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.937771 4799 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/890e0f2b-118e-4589-8b8a-9901bf32e00f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.954582 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890e0f2b-118e-4589-8b8a-9901bf32e00f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "890e0f2b-118e-4589-8b8a-9901bf32e00f" (UID: "890e0f2b-118e-4589-8b8a-9901bf32e00f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.965720 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:07 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:07 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:07 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:07 crc kubenswrapper[4799]: I0319 20:08:07.965776 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.038981 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/890e0f2b-118e-4589-8b8a-9901bf32e00f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.382860 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.382981 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"890e0f2b-118e-4589-8b8a-9901bf32e00f","Type":"ContainerDied","Data":"ebc7a0bc21937a2e55b17d112ddcdd49e42f19be77d13d2e709be9f86d02abcc"} Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.383015 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebc7a0bc21937a2e55b17d112ddcdd49e42f19be77d13d2e709be9f86d02abcc" Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.618565 4799 ???:1] "http: TLS handshake error from 192.168.126.11:51596: no serving certificate available for the kubelet" Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.965917 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:08 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:08 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:08 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:08 crc kubenswrapper[4799]: I0319 20:08:08.966185 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:09 crc kubenswrapper[4799]: I0319 20:08:09.791575 4799 ???:1] "http: TLS handshake error from 192.168.126.11:51606: no serving certificate available for the kubelet" Mar 19 20:08:09 crc kubenswrapper[4799]: I0319 20:08:09.965300 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:09 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:09 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:09 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:09 crc kubenswrapper[4799]: I0319 20:08:09.965363 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:10 crc kubenswrapper[4799]: I0319 20:08:10.965170 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:10 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:10 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:10 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:10 crc kubenswrapper[4799]: I0319 20:08:10.965227 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:11 crc kubenswrapper[4799]: I0319 20:08:11.964838 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:11 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:11 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:11 crc 
kubenswrapper[4799]: healthz check failed Mar 19 20:08:11 crc kubenswrapper[4799]: I0319 20:08:11.965262 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:12 crc kubenswrapper[4799]: I0319 20:08:12.124919 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 20:08:12 crc kubenswrapper[4799]: I0319 20:08:12.965012 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:12 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:12 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:12 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:12 crc kubenswrapper[4799]: I0319 20:08:12.965305 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:13 crc kubenswrapper[4799]: I0319 20:08:13.142098 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.142058183 podStartE2EDuration="1.142058183s" podCreationTimestamp="2026-03-19 20:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:13.13590336 +0000 UTC m=+170.741856432" watchObservedRunningTime="2026-03-19 20:08:13.142058183 +0000 UTC m=+170.748011255" Mar 19 20:08:13 crc kubenswrapper[4799]: I0319 
20:08:13.919439 4799 patch_prober.go:28] interesting pod/console-f9d7485db-lq2lw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 19 20:08:13 crc kubenswrapper[4799]: I0319 20:08:13.919489 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lq2lw" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 19 20:08:13 crc kubenswrapper[4799]: I0319 20:08:13.964732 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:13 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:13 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:13 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:13 crc kubenswrapper[4799]: I0319 20:08:13.964788 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:14 crc kubenswrapper[4799]: I0319 20:08:14.233490 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-49fwh" Mar 19 20:08:14 crc kubenswrapper[4799]: E0319 20:08:14.433561 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:14 crc kubenswrapper[4799]: E0319 20:08:14.435128 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:14 crc kubenswrapper[4799]: E0319 20:08:14.439730 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:14 crc kubenswrapper[4799]: E0319 20:08:14.439768 4799 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:14 crc kubenswrapper[4799]: I0319 20:08:14.965283 4799 patch_prober.go:28] interesting pod/router-default-5444994796-zxk4t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:08:14 crc kubenswrapper[4799]: [-]has-synced failed: reason withheld Mar 19 20:08:14 crc kubenswrapper[4799]: [+]process-running ok Mar 19 20:08:14 crc kubenswrapper[4799]: healthz check failed Mar 19 20:08:14 crc kubenswrapper[4799]: I0319 20:08:14.965370 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zxk4t" podUID="ae84a622-64a2-408d-9275-b3ece41df758" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:08:15 crc kubenswrapper[4799]: I0319 20:08:15.966691 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:08:15 crc kubenswrapper[4799]: I0319 20:08:15.969824 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zxk4t" Mar 19 20:08:16 crc kubenswrapper[4799]: I0319 20:08:16.117023 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:08:16 crc kubenswrapper[4799]: E0319 20:08:16.117347 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.143111 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77dcb595b7-jv6pv"] Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.143840 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerName="controller-manager" containerID="cri-o://b67fd6bafebeb20c71b0fa86e2cc11cdb90d196c5176ec474ed4b23c08ed51b2" gracePeriod=30 Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.178336 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5"] Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.179132 4799 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" podUID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" containerName="route-controller-manager" containerID="cri-o://b94ddb2e4a9adb827c1417ee8f4401d99b961755bdaa4693c665207d1906c864" gracePeriod=30 Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.485881 4799 generic.go:334] "Generic (PLEG): container finished" podID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerID="b67fd6bafebeb20c71b0fa86e2cc11cdb90d196c5176ec474ed4b23c08ed51b2" exitCode=0 Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.485965 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" event={"ID":"f2c96b18-7050-4f09-b776-93b6a5c62eed","Type":"ContainerDied","Data":"b67fd6bafebeb20c71b0fa86e2cc11cdb90d196c5176ec474ed4b23c08ed51b2"} Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.488111 4799 generic.go:334] "Generic (PLEG): container finished" podID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" containerID="b94ddb2e4a9adb827c1417ee8f4401d99b961755bdaa4693c665207d1906c864" exitCode=0 Mar 19 20:08:19 crc kubenswrapper[4799]: I0319 20:08:19.488171 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" event={"ID":"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5","Type":"ContainerDied","Data":"b94ddb2e4a9adb827c1417ee8f4401d99b961755bdaa4693c665207d1906c864"} Mar 19 20:08:21 crc kubenswrapper[4799]: I0319 20:08:21.653626 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:08:23 crc kubenswrapper[4799]: I0319 20:08:23.767965 4799 patch_prober.go:28] interesting pod/controller-manager-77dcb595b7-jv6pv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial 
tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 19 20:08:23 crc kubenswrapper[4799]: I0319 20:08:23.768701 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 19 20:08:23 crc kubenswrapper[4799]: I0319 20:08:23.933443 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:08:23 crc kubenswrapper[4799]: I0319 20:08:23.937960 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:08:24 crc kubenswrapper[4799]: E0319 20:08:24.428760 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:24 crc kubenswrapper[4799]: E0319 20:08:24.430483 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:24 crc kubenswrapper[4799]: E0319 20:08:24.431520 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 
19 20:08:24 crc kubenswrapper[4799]: E0319 20:08:24.431589 4799 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:24 crc kubenswrapper[4799]: I0319 20:08:24.752461 4799 patch_prober.go:28] interesting pod/route-controller-manager-7f97766657-rk9j5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:08:24 crc kubenswrapper[4799]: I0319 20:08:24.752529 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" podUID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:08:29 crc kubenswrapper[4799]: I0319 20:08:29.116744 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:08:29 crc kubenswrapper[4799]: E0319 20:08:29.117876 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 20:08:29 crc kubenswrapper[4799]: I0319 
20:08:29.579352 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-dzmpr_a06124f0-b975-4b73-b58d-f678af8cda26/kube-multus-additional-cni-plugins/0.log" Mar 19 20:08:29 crc kubenswrapper[4799]: I0319 20:08:29.579612 4799 generic.go:334] "Generic (PLEG): container finished" podID="a06124f0-b975-4b73-b58d-f678af8cda26" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" exitCode=137 Mar 19 20:08:29 crc kubenswrapper[4799]: I0319 20:08:29.579643 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" event={"ID":"a06124f0-b975-4b73-b58d-f678af8cda26","Type":"ContainerDied","Data":"fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328"} Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.293519 4799 ???:1] "http: TLS handshake error from 192.168.126.11:38646: no serving certificate available for the kubelet" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.950054 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.988324 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff"] Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.989235 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890e0f2b-118e-4589-8b8a-9901bf32e00f" containerName="pruner" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.989253 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="890e0f2b-118e-4589-8b8a-9901bf32e00f" containerName="pruner" Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.989270 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" containerName="route-controller-manager" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.989280 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" containerName="route-controller-manager" Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.989295 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8720bb6-e6ea-43b3-a750-d0b7c1221266" containerName="collect-profiles" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.989305 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8720bb6-e6ea-43b3-a750-d0b7c1221266" containerName="collect-profiles" Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.989317 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e712659-18f0-47eb-ba81-7b09b45677b5" containerName="pruner" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.989324 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e712659-18f0-47eb-ba81-7b09b45677b5" containerName="pruner" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.990842 4799 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7e712659-18f0-47eb-ba81-7b09b45677b5" containerName="pruner" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.990872 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8720bb6-e6ea-43b3-a750-d0b7c1221266" containerName="collect-profiles" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.990882 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="890e0f2b-118e-4589-8b8a-9901bf32e00f" containerName="pruner" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.992424 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" containerName="route-controller-manager" Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.993126 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.993488 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.993448 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfj45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2gp4f_openshift-marketplace(03f0bc9a-1283-46a4-a6df-05d8ee9c3c40): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Mar 19 20:08:30 crc kubenswrapper[4799]: I0319 20:08:30.994543 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff"] Mar 19 20:08:30 crc kubenswrapper[4799]: E0319 20:08:30.994745 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2gp4f" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" Mar 19 20:08:31 crc kubenswrapper[4799]: E0319 20:08:31.017546 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 20:08:31 crc kubenswrapper[4799]: E0319 20:08:31.017701 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-97ks6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2587j_openshift-marketplace(af016f47-a261-463b-98e4-710fdb4557bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 20:08:31 crc kubenswrapper[4799]: E0319 20:08:31.018875 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2587j" podUID="af016f47-a261-463b-98e4-710fdb4557bb" Mar 19 20:08:31 crc 
kubenswrapper[4799]: I0319 20:08:31.085275 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-client-ca\") pod \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.085360 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2cnm\" (UniqueName: \"kubernetes.io/projected/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-kube-api-access-w2cnm\") pod \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.085412 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-config\") pod \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.085439 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-serving-cert\") pod \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\" (UID: \"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5\") " Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.086261 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" (UID: "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.087254 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-config" (OuterVolumeSpecName: "config") pod "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" (UID: "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.093507 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-kube-api-access-w2cnm" (OuterVolumeSpecName: "kube-api-access-w2cnm") pod "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" (UID: "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5"). InnerVolumeSpecName "kube-api-access-w2cnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.100284 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" (UID: "1b3d929b-cf45-4713-8ecc-cabc98bb6dc5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187104 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-client-ca\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187153 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-config\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187233 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbae66c-d2a0-4942-8cdd-60ef4b413765-serving-cert\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187320 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t84dw\" (UniqueName: \"kubernetes.io/projected/0cbae66c-d2a0-4942-8cdd-60ef4b413765-kube-api-access-t84dw\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187620 4799 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187657 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2cnm\" (UniqueName: \"kubernetes.io/projected/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-kube-api-access-w2cnm\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187673 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.187682 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.288645 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-client-ca\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.288682 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-config\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.288708 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0cbae66c-d2a0-4942-8cdd-60ef4b413765-serving-cert\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.288725 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t84dw\" (UniqueName: \"kubernetes.io/projected/0cbae66c-d2a0-4942-8cdd-60ef4b413765-kube-api-access-t84dw\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.289789 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-client-ca\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.290578 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-config\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.301567 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbae66c-d2a0-4942-8cdd-60ef4b413765-serving-cert\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc 
kubenswrapper[4799]: I0319 20:08:31.304992 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t84dw\" (UniqueName: \"kubernetes.io/projected/0cbae66c-d2a0-4942-8cdd-60ef4b413765-kube-api-access-t84dw\") pod \"route-controller-manager-55ff6c6977-v9wff\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.314759 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.589493 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" event={"ID":"1b3d929b-cf45-4713-8ecc-cabc98bb6dc5","Type":"ContainerDied","Data":"f45439a99abc1ef5bb98875b14022c24a602e9196c2495f5012fd8b5157d4501"} Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.589557 4799 scope.go:117] "RemoveContainer" containerID="b94ddb2e4a9adb827c1417ee8f4401d99b961755bdaa4693c665207d1906c864" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.589727 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5" Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.618555 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5"] Mar 19 20:08:31 crc kubenswrapper[4799]: I0319 20:08:31.621665 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f97766657-rk9j5"] Mar 19 20:08:32 crc kubenswrapper[4799]: E0319 20:08:32.548251 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2gp4f" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" Mar 19 20:08:32 crc kubenswrapper[4799]: E0319 20:08:32.548255 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2587j" podUID="af016f47-a261-463b-98e4-710fdb4557bb" Mar 19 20:08:32 crc kubenswrapper[4799]: E0319 20:08:32.654423 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 20:08:32 crc kubenswrapper[4799]: E0319 20:08:32.654592 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wtwfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lf2lf_openshift-marketplace(0dfa7f99-f171-41a9-8931-07caaeaa06e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 20:08:32 crc kubenswrapper[4799]: E0319 20:08:32.655770 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lf2lf" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" Mar 19 20:08:33 crc 
kubenswrapper[4799]: I0319 20:08:33.125041 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3d929b-cf45-4713-8ecc-cabc98bb6dc5" path="/var/lib/kubelet/pods/1b3d929b-cf45-4713-8ecc-cabc98bb6dc5/volumes" Mar 19 20:08:33 crc kubenswrapper[4799]: E0319 20:08:33.418125 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 20:08:33 crc kubenswrapper[4799]: E0319 20:08:33.418491 4799 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 20:08:33 crc kubenswrapper[4799]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 20:08:33 crc kubenswrapper[4799]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fw98s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565848-5gcs8_openshift-infra(31aa7077-55c5-426b-a92f-c93b8d767105): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 20:08:33 crc kubenswrapper[4799]: > logger="UnhandledError" Mar 19 20:08:33 
crc kubenswrapper[4799]: E0319 20:08:33.419692 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" podUID="31aa7077-55c5-426b-a92f-c93b8d767105" Mar 19 20:08:33 crc kubenswrapper[4799]: E0319 20:08:33.603434 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" podUID="31aa7077-55c5-426b-a92f-c93b8d767105" Mar 19 20:08:34 crc kubenswrapper[4799]: E0319 20:08:34.425729 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328 is running failed: container process not found" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:34 crc kubenswrapper[4799]: E0319 20:08:34.426500 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328 is running failed: container process not found" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:34 crc kubenswrapper[4799]: E0319 20:08:34.427154 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328 is running failed: container process not found" 
containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 20:08:34 crc kubenswrapper[4799]: E0319 20:08:34.427211 4799 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:34 crc kubenswrapper[4799]: I0319 20:08:34.762494 4799 patch_prober.go:28] interesting pod/controller-manager-77dcb595b7-jv6pv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:08:34 crc kubenswrapper[4799]: I0319 20:08:34.762571 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:08:35 crc kubenswrapper[4799]: I0319 20:08:35.001926 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f7jjj" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.739565 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.740958 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.742614 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.743180 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.746732 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.853823 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.853913 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.955528 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.955607 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.955693 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:36 crc kubenswrapper[4799]: I0319 20:08:36.975442 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.065086 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.309256 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lf2lf" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.407319 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.407454 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wdqml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-q5qgk_openshift-marketplace(9837fdbc-dcc3-4ba4-a168-a5ec29496871): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.407768 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.409365 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q5qgk" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.410465 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-dzmpr_a06124f0-b975-4b73-b58d-f678af8cda26/kube-multus-additional-cni-plugins/0.log" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.410511 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.414629 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.414742 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdxbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sq5wd_openshift-marketplace(9365ab34-43df-42ee-bc04-baeba5579717): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.415896 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sq5wd" podUID="9365ab34-43df-42ee-bc04-baeba5579717" Mar 19 20:08:37 crc 
kubenswrapper[4799]: I0319 20:08:37.438059 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh"] Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.438338 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.438351 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.438369 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerName="controller-manager" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.438421 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerName="controller-manager" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.438549 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" containerName="kube-multus-additional-cni-plugins" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.438574 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" containerName="controller-manager" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.439073 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.446345 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh"] Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.463877 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist\") pod \"a06124f0-b975-4b73-b58d-f678af8cda26\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.463940 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a06124f0-b975-4b73-b58d-f678af8cda26-ready\") pod \"a06124f0-b975-4b73-b58d-f678af8cda26\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.463962 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-proxy-ca-bundles\") pod \"f2c96b18-7050-4f09-b776-93b6a5c62eed\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.463979 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-client-ca\") pod \"f2c96b18-7050-4f09-b776-93b6a5c62eed\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.463993 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-config\") pod \"f2c96b18-7050-4f09-b776-93b6a5c62eed\" (UID: 
\"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.464069 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c96b18-7050-4f09-b776-93b6a5c62eed-serving-cert\") pod \"f2c96b18-7050-4f09-b776-93b6a5c62eed\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.464085 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a06124f0-b975-4b73-b58d-f678af8cda26-tuning-conf-dir\") pod \"a06124f0-b975-4b73-b58d-f678af8cda26\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.464124 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p8wf\" (UniqueName: \"kubernetes.io/projected/f2c96b18-7050-4f09-b776-93b6a5c62eed-kube-api-access-7p8wf\") pod \"f2c96b18-7050-4f09-b776-93b6a5c62eed\" (UID: \"f2c96b18-7050-4f09-b776-93b6a5c62eed\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.464143 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwbg7\" (UniqueName: \"kubernetes.io/projected/a06124f0-b975-4b73-b58d-f678af8cda26-kube-api-access-hwbg7\") pod \"a06124f0-b975-4b73-b58d-f678af8cda26\" (UID: \"a06124f0-b975-4b73-b58d-f678af8cda26\") " Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.464509 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06124f0-b975-4b73-b58d-f678af8cda26-ready" (OuterVolumeSpecName: "ready") pod "a06124f0-b975-4b73-b58d-f678af8cda26" (UID: "a06124f0-b975-4b73-b58d-f678af8cda26"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.464563 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a06124f0-b975-4b73-b58d-f678af8cda26-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "a06124f0-b975-4b73-b58d-f678af8cda26" (UID: "a06124f0-b975-4b73-b58d-f678af8cda26"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.465480 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-client-ca" (OuterVolumeSpecName: "client-ca") pod "f2c96b18-7050-4f09-b776-93b6a5c62eed" (UID: "f2c96b18-7050-4f09-b776-93b6a5c62eed"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.465507 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-config" (OuterVolumeSpecName: "config") pod "f2c96b18-7050-4f09-b776-93b6a5c62eed" (UID: "f2c96b18-7050-4f09-b776-93b6a5c62eed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.466578 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "a06124f0-b975-4b73-b58d-f678af8cda26" (UID: "a06124f0-b975-4b73-b58d-f678af8cda26"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.469980 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.470132 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5npwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Restar
tPolicy:nil,} start failed in pod redhat-operators-k2dks_openshift-marketplace(3b719eba-9287-4b38-9749-f5c2e09e32e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.470345 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c96b18-7050-4f09-b776-93b6a5c62eed-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f2c96b18-7050-4f09-b776-93b6a5c62eed" (UID: "f2c96b18-7050-4f09-b776-93b6a5c62eed"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.470398 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f2c96b18-7050-4f09-b776-93b6a5c62eed" (UID: "f2c96b18-7050-4f09-b776-93b6a5c62eed"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.472276 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06124f0-b975-4b73-b58d-f678af8cda26-kube-api-access-hwbg7" (OuterVolumeSpecName: "kube-api-access-hwbg7") pod "a06124f0-b975-4b73-b58d-f678af8cda26" (UID: "a06124f0-b975-4b73-b58d-f678af8cda26"). InnerVolumeSpecName "kube-api-access-hwbg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.472349 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-k2dks" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.476589 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c96b18-7050-4f09-b776-93b6a5c62eed-kube-api-access-7p8wf" (OuterVolumeSpecName: "kube-api-access-7p8wf") pod "f2c96b18-7050-4f09-b776-93b6a5c62eed" (UID: "f2c96b18-7050-4f09-b776-93b6a5c62eed"). InnerVolumeSpecName "kube-api-access-7p8wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.497939 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.498273 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zq7t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fgtxw_openshift-marketplace(4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.499466 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fgtxw" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" Mar 19 20:08:37 crc 
kubenswrapper[4799]: I0319 20:08:37.565835 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b554b5b-9cf3-4306-897e-2e4e93f407f3-serving-cert\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.565881 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f84wb\" (UniqueName: \"kubernetes.io/projected/0b554b5b-9cf3-4306-897e-2e4e93f407f3-kube-api-access-f84wb\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.565902 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-config\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.565941 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-proxy-ca-bundles\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.565962 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-client-ca\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566004 4799 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/a06124f0-b975-4b73-b58d-f678af8cda26-ready\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566015 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566024 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566033 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c96b18-7050-4f09-b776-93b6a5c62eed-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566060 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c96b18-7050-4f09-b776-93b6a5c62eed-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566071 4799 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a06124f0-b975-4b73-b58d-f678af8cda26-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566080 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p8wf\" (UniqueName: 
\"kubernetes.io/projected/f2c96b18-7050-4f09-b776-93b6a5c62eed-kube-api-access-7p8wf\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566089 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwbg7\" (UniqueName: \"kubernetes.io/projected/a06124f0-b975-4b73-b58d-f678af8cda26-kube-api-access-hwbg7\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.566098 4799 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a06124f0-b975-4b73-b58d-f678af8cda26-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.593204 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff"] Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.631928 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-dzmpr_a06124f0-b975-4b73-b58d-f678af8cda26/kube-multus-additional-cni-plugins/0.log" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.632007 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" event={"ID":"a06124f0-b975-4b73-b58d-f678af8cda26","Type":"ContainerDied","Data":"8ad8a07d013a58e12b0bf71761d83b6e505fca0effc26849212fa0044c32cad5"} Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.632036 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-dzmpr" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.632129 4799 scope.go:117] "RemoveContainer" containerID="fe29cdf9a631c8936f04a423455fc27c8e6bae3519a9b711d0e405ed36456328" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.638518 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" event={"ID":"f2c96b18-7050-4f09-b776-93b6a5c62eed","Type":"ContainerDied","Data":"477bddeb4f0426fb0f2f811d8f9315384958080e2c32230c40586fd96c44f602"} Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.638591 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dcb595b7-jv6pv" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.639681 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" event={"ID":"0cbae66c-d2a0-4942-8cdd-60ef4b413765","Type":"ContainerStarted","Data":"b0bbaa2010a0bde2a4a7d51eada8193523476219628972baaf7b48902ea9bd9b"} Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.642591 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerStarted","Data":"e1861d460b2af27d98d8f1e474d168a6ef51077309b26a7494ae75630aa450a7"} Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.651728 4799 scope.go:117] "RemoveContainer" containerID="b67fd6bafebeb20c71b0fa86e2cc11cdb90d196c5176ec474ed4b23c08ed51b2" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.659757 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/community-operators-fgtxw" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.659888 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-k2dks" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.659978 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q5qgk" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" Mar 19 20:08:37 crc kubenswrapper[4799]: E0319 20:08:37.660058 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sq5wd" podUID="9365ab34-43df-42ee-bc04-baeba5579717" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.667631 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-proxy-ca-bundles\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.667670 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-client-ca\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: 
\"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.667915 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b554b5b-9cf3-4306-897e-2e4e93f407f3-serving-cert\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.667945 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f84wb\" (UniqueName: \"kubernetes.io/projected/0b554b5b-9cf3-4306-897e-2e4e93f407f3-kube-api-access-f84wb\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.669288 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-config\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.672424 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-config\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.673835 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-proxy-ca-bundles\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.674348 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-client-ca\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.681309 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b554b5b-9cf3-4306-897e-2e4e93f407f3-serving-cert\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.702052 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f84wb\" (UniqueName: \"kubernetes.io/projected/0b554b5b-9cf3-4306-897e-2e4e93f407f3-kube-api-access-f84wb\") pod \"controller-manager-7b44b4b9f4-f8djh\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.758166 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77dcb595b7-jv6pv"] Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.760646 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77dcb595b7-jv6pv"] Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.767229 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.801306 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-dzmpr"] Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.804505 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-dzmpr"] Mar 19 20:08:37 crc kubenswrapper[4799]: I0319 20:08:37.867874 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.165574 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh"] Mar 19 20:08:38 crc kubenswrapper[4799]: W0319 20:08:38.176310 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b554b5b_9cf3_4306_897e_2e4e93f407f3.slice/crio-289f58b8b8727a2e8e1721a0c6afb3950ac6bbbece655670e884c4146841efdc WatchSource:0}: Error finding container 289f58b8b8727a2e8e1721a0c6afb3950ac6bbbece655670e884c4146841efdc: Status 404 returned error can't find the container with id 289f58b8b8727a2e8e1721a0c6afb3950ac6bbbece655670e884c4146841efdc Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.653619 4799 generic.go:334] "Generic (PLEG): container finished" podID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerID="e1861d460b2af27d98d8f1e474d168a6ef51077309b26a7494ae75630aa450a7" exitCode=0 Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.653801 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerDied","Data":"e1861d460b2af27d98d8f1e474d168a6ef51077309b26a7494ae75630aa450a7"} Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.656661 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" event={"ID":"0cbae66c-d2a0-4942-8cdd-60ef4b413765","Type":"ContainerStarted","Data":"d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c"} Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.657647 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.659634 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" event={"ID":"0b554b5b-9cf3-4306-897e-2e4e93f407f3","Type":"ContainerStarted","Data":"5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720"} Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.659662 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" event={"ID":"0b554b5b-9cf3-4306-897e-2e4e93f407f3","Type":"ContainerStarted","Data":"289f58b8b8727a2e8e1721a0c6afb3950ac6bbbece655670e884c4146841efdc"} Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.659958 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.662275 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"182a073a-d8e7-4cc7-8476-b34d8ceb12c0","Type":"ContainerStarted","Data":"945dd4987efc5f647269bc1f656f043b8602ad2f5d907dfb61eab5ed2fbbbe21"} Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.662311 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"182a073a-d8e7-4cc7-8476-b34d8ceb12c0","Type":"ContainerStarted","Data":"de1e833eca0da0a82dfbad4dd3527b34e55062c791a6ec8cde872e9c882755a2"} Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.664353 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.664535 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.695896 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" podStartSLOduration=19.695878264 podStartE2EDuration="19.695878264s" podCreationTimestamp="2026-03-19 20:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:38.692691635 +0000 UTC m=+196.298644717" watchObservedRunningTime="2026-03-19 20:08:38.695878264 +0000 UTC m=+196.301831346" Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.733093 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" podStartSLOduration=19.73307577 podStartE2EDuration="19.73307577s" podCreationTimestamp="2026-03-19 20:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:38.728148483 +0000 UTC m=+196.334101555" watchObservedRunningTime="2026-03-19 20:08:38.73307577 +0000 UTC m=+196.339028842" Mar 19 20:08:38 crc kubenswrapper[4799]: I0319 20:08:38.742348 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
podStartSLOduration=2.742329318 podStartE2EDuration="2.742329318s" podCreationTimestamp="2026-03-19 20:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:38.739037176 +0000 UTC m=+196.344990248" watchObservedRunningTime="2026-03-19 20:08:38.742329318 +0000 UTC m=+196.348282390" Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.125451 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06124f0-b975-4b73-b58d-f678af8cda26" path="/var/lib/kubelet/pods/a06124f0-b975-4b73-b58d-f678af8cda26/volumes" Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.126440 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c96b18-7050-4f09-b776-93b6a5c62eed" path="/var/lib/kubelet/pods/f2c96b18-7050-4f09-b776-93b6a5c62eed/volumes" Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.158316 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh"] Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.273837 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff"] Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.671539 4799 generic.go:334] "Generic (PLEG): container finished" podID="182a073a-d8e7-4cc7-8476-b34d8ceb12c0" containerID="945dd4987efc5f647269bc1f656f043b8602ad2f5d907dfb61eab5ed2fbbbe21" exitCode=0 Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.671608 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"182a073a-d8e7-4cc7-8476-b34d8ceb12c0","Type":"ContainerDied","Data":"945dd4987efc5f647269bc1f656f043b8602ad2f5d907dfb61eab5ed2fbbbe21"} Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.675340 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerStarted","Data":"f89f71ec26713ac35cc2030ba689812451062b038978170b0e8787322de311b5"} Mar 19 20:08:39 crc kubenswrapper[4799]: I0319 20:08:39.721757 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4g9c" podStartSLOduration=3.463755738 podStartE2EDuration="40.721732259s" podCreationTimestamp="2026-03-19 20:07:59 +0000 UTC" firstStartedPulling="2026-03-19 20:08:01.815540196 +0000 UTC m=+159.421493268" lastFinishedPulling="2026-03-19 20:08:39.073516717 +0000 UTC m=+196.679469789" observedRunningTime="2026-03-19 20:08:39.717636805 +0000 UTC m=+197.323589877" watchObservedRunningTime="2026-03-19 20:08:39.721732259 +0000 UTC m=+197.327685361" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.082930 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.435208 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.435269 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.693591 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" podUID="0cbae66c-d2a0-4942-8cdd-60ef4b413765" containerName="route-controller-manager" containerID="cri-o://d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c" gracePeriod=30 Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.694815 4799 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" podUID="0b554b5b-9cf3-4306-897e-2e4e93f407f3" containerName="controller-manager" containerID="cri-o://5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720" gracePeriod=30 Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.736409 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.737409 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.747677 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.811417 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.811714 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-var-lock\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.811757 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kube-api-access\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 
20:08:40.913752 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.913858 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-var-lock\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.913878 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.913915 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kube-api-access\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.914034 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-var-lock\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.935865 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kube-api-access\") pod \"installer-9-crc\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:40 crc kubenswrapper[4799]: I0319 20:08:40.970274 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.058680 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.116551 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kube-api-access\") pod \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.116673 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kubelet-dir\") pod \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\" (UID: \"182a073a-d8e7-4cc7-8476-b34d8ceb12c0\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.116942 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "182a073a-d8e7-4cc7-8476-b34d8ceb12c0" (UID: "182a073a-d8e7-4cc7-8476-b34d8ceb12c0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.122851 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "182a073a-d8e7-4cc7-8476-b34d8ceb12c0" (UID: "182a073a-d8e7-4cc7-8476-b34d8ceb12c0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.153178 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.158233 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.220776 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-config\") pod \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.220833 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t84dw\" (UniqueName: \"kubernetes.io/projected/0cbae66c-d2a0-4942-8cdd-60ef4b413765-kube-api-access-t84dw\") pod \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.220864 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b554b5b-9cf3-4306-897e-2e4e93f407f3-serving-cert\") pod \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " 
Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.220898 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-client-ca\") pod \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.220935 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbae66c-d2a0-4942-8cdd-60ef4b413765-serving-cert\") pod \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.220980 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f84wb\" (UniqueName: \"kubernetes.io/projected/0b554b5b-9cf3-4306-897e-2e4e93f407f3-kube-api-access-f84wb\") pod \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.221014 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-client-ca\") pod \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\" (UID: \"0cbae66c-d2a0-4942-8cdd-60ef4b413765\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.221053 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-proxy-ca-bundles\") pod \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.221096 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-config\") pod \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\" (UID: \"0b554b5b-9cf3-4306-897e-2e4e93f407f3\") " Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.221321 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.221344 4799 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/182a073a-d8e7-4cc7-8476-b34d8ceb12c0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.222253 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-config" (OuterVolumeSpecName: "config") pod "0b554b5b-9cf3-4306-897e-2e4e93f407f3" (UID: "0b554b5b-9cf3-4306-897e-2e4e93f407f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.222822 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-config" (OuterVolumeSpecName: "config") pod "0cbae66c-d2a0-4942-8cdd-60ef4b413765" (UID: "0cbae66c-d2a0-4942-8cdd-60ef4b413765"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.224502 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b554b5b-9cf3-4306-897e-2e4e93f407f3" (UID: "0b554b5b-9cf3-4306-897e-2e4e93f407f3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.224759 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-client-ca" (OuterVolumeSpecName: "client-ca") pod "0cbae66c-d2a0-4942-8cdd-60ef4b413765" (UID: "0cbae66c-d2a0-4942-8cdd-60ef4b413765"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.225761 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0b554b5b-9cf3-4306-897e-2e4e93f407f3" (UID: "0b554b5b-9cf3-4306-897e-2e4e93f407f3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.228498 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b554b5b-9cf3-4306-897e-2e4e93f407f3-kube-api-access-f84wb" (OuterVolumeSpecName: "kube-api-access-f84wb") pod "0b554b5b-9cf3-4306-897e-2e4e93f407f3" (UID: "0b554b5b-9cf3-4306-897e-2e4e93f407f3"). InnerVolumeSpecName "kube-api-access-f84wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.228926 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbae66c-d2a0-4942-8cdd-60ef4b413765-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0cbae66c-d2a0-4942-8cdd-60ef4b413765" (UID: "0cbae66c-d2a0-4942-8cdd-60ef4b413765"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.230259 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b554b5b-9cf3-4306-897e-2e4e93f407f3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b554b5b-9cf3-4306-897e-2e4e93f407f3" (UID: "0b554b5b-9cf3-4306-897e-2e4e93f407f3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.231075 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbae66c-d2a0-4942-8cdd-60ef4b413765-kube-api-access-t84dw" (OuterVolumeSpecName: "kube-api-access-t84dw") pod "0cbae66c-d2a0-4942-8cdd-60ef4b413765" (UID: "0cbae66c-d2a0-4942-8cdd-60ef4b413765"). InnerVolumeSpecName "kube-api-access-t84dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322363 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f84wb\" (UniqueName: \"kubernetes.io/projected/0b554b5b-9cf3-4306-897e-2e4e93f407f3-kube-api-access-f84wb\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322406 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322417 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322425 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-config\") on node \"crc\" 
DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322434 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cbae66c-d2a0-4942-8cdd-60ef4b413765-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322443 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t84dw\" (UniqueName: \"kubernetes.io/projected/0cbae66c-d2a0-4942-8cdd-60ef4b413765-kube-api-access-t84dw\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322451 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b554b5b-9cf3-4306-897e-2e4e93f407f3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322461 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b554b5b-9cf3-4306-897e-2e4e93f407f3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.322469 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cbae66c-d2a0-4942-8cdd-60ef4b413765-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.484671 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 20:08:41 crc kubenswrapper[4799]: W0319 20:08:41.494552 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3e94a729_3d36_44ce_8b8f_29e9183ec3ff.slice/crio-5f7f639cf101f7232883e57690e8ef8809e24024330541fde1bd49d0833281ca WatchSource:0}: Error finding container 5f7f639cf101f7232883e57690e8ef8809e24024330541fde1bd49d0833281ca: Status 404 returned error can't find the container with id 
5f7f639cf101f7232883e57690e8ef8809e24024330541fde1bd49d0833281ca Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.684603 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-m4g9c" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="registry-server" probeResult="failure" output=< Mar 19 20:08:41 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:08:41 crc kubenswrapper[4799]: > Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.696911 4799 generic.go:334] "Generic (PLEG): container finished" podID="0b554b5b-9cf3-4306-897e-2e4e93f407f3" containerID="5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720" exitCode=0 Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.697036 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.697721 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" event={"ID":"0b554b5b-9cf3-4306-897e-2e4e93f407f3","Type":"ContainerDied","Data":"5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720"} Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.697818 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh" event={"ID":"0b554b5b-9cf3-4306-897e-2e4e93f407f3","Type":"ContainerDied","Data":"289f58b8b8727a2e8e1721a0c6afb3950ac6bbbece655670e884c4146841efdc"} Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.697880 4799 scope.go:117] "RemoveContainer" containerID="5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.700072 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"182a073a-d8e7-4cc7-8476-b34d8ceb12c0","Type":"ContainerDied","Data":"de1e833eca0da0a82dfbad4dd3527b34e55062c791a6ec8cde872e9c882755a2"} Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.700110 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de1e833eca0da0a82dfbad4dd3527b34e55062c791a6ec8cde872e9c882755a2" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.700125 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.701958 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3e94a729-3d36-44ce-8b8f-29e9183ec3ff","Type":"ContainerStarted","Data":"5f7f639cf101f7232883e57690e8ef8809e24024330541fde1bd49d0833281ca"} Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.704042 4799 generic.go:334] "Generic (PLEG): container finished" podID="0cbae66c-d2a0-4942-8cdd-60ef4b413765" containerID="d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c" exitCode=0 Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.704085 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.704151 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" event={"ID":"0cbae66c-d2a0-4942-8cdd-60ef4b413765","Type":"ContainerDied","Data":"d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c"} Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.704179 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff" event={"ID":"0cbae66c-d2a0-4942-8cdd-60ef4b413765","Type":"ContainerDied","Data":"b0bbaa2010a0bde2a4a7d51eada8193523476219628972baaf7b48902ea9bd9b"} Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.713866 4799 scope.go:117] "RemoveContainer" containerID="5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720" Mar 19 20:08:41 crc kubenswrapper[4799]: E0319 20:08:41.714190 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720\": container with ID starting with 5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720 not found: ID does not exist" containerID="5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.714268 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720"} err="failed to get container status \"5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720\": rpc error: code = NotFound desc = could not find container \"5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720\": container with ID starting with 
5f532478df0d959686032e377fb48944035b4c452ad8bbf63bfef287f5674720 not found: ID does not exist" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.714309 4799 scope.go:117] "RemoveContainer" containerID="d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.721710 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh"] Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.727714 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b44b4b9f4-f8djh"] Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.737984 4799 scope.go:117] "RemoveContainer" containerID="d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c" Mar 19 20:08:41 crc kubenswrapper[4799]: E0319 20:08:41.738655 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c\": container with ID starting with d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c not found: ID does not exist" containerID="d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.738695 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c"} err="failed to get container status \"d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c\": rpc error: code = NotFound desc = could not find container \"d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c\": container with ID starting with d9c582ed107b91d113b03d464a3d491c7ab257d3ec7651a055d17f152c456e1c not found: ID does not exist" Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.740135 4799 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff"] Mar 19 20:08:41 crc kubenswrapper[4799]: I0319 20:08:41.744728 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55ff6c6977-v9wff"] Mar 19 20:08:42 crc kubenswrapper[4799]: I0319 20:08:42.655973 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j5t7c"] Mar 19 20:08:42 crc kubenswrapper[4799]: I0319 20:08:42.709091 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3e94a729-3d36-44ce-8b8f-29e9183ec3ff","Type":"ContainerStarted","Data":"7fda262e2ec063127a6ea9b484195d5a3b5c9c2ac225f5902c71aa2df46171f3"} Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.119453 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.124577 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b554b5b-9cf3-4306-897e-2e4e93f407f3" path="/var/lib/kubelet/pods/0b554b5b-9cf3-4306-897e-2e4e93f407f3/volumes" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.125334 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbae66c-d2a0-4942-8cdd-60ef4b413765" path="/var/lib/kubelet/pods/0cbae66c-d2a0-4942-8cdd-60ef4b413765/volumes" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.438300 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.438285353 podStartE2EDuration="3.438285353s" podCreationTimestamp="2026-03-19 20:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:42.739032149 +0000 UTC m=+200.344985221" 
watchObservedRunningTime="2026-03-19 20:08:43.438285353 +0000 UTC m=+201.044238425" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.440269 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj"] Mar 19 20:08:43 crc kubenswrapper[4799]: E0319 20:08:43.440767 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbae66c-d2a0-4942-8cdd-60ef4b413765" containerName="route-controller-manager" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.440804 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbae66c-d2a0-4942-8cdd-60ef4b413765" containerName="route-controller-manager" Mar 19 20:08:43 crc kubenswrapper[4799]: E0319 20:08:43.440827 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b554b5b-9cf3-4306-897e-2e4e93f407f3" containerName="controller-manager" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.440844 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b554b5b-9cf3-4306-897e-2e4e93f407f3" containerName="controller-manager" Mar 19 20:08:43 crc kubenswrapper[4799]: E0319 20:08:43.440888 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182a073a-d8e7-4cc7-8476-b34d8ceb12c0" containerName="pruner" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.440901 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="182a073a-d8e7-4cc7-8476-b34d8ceb12c0" containerName="pruner" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.441069 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbae66c-d2a0-4942-8cdd-60ef4b413765" containerName="route-controller-manager" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.441107 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b554b5b-9cf3-4306-897e-2e4e93f407f3" containerName="controller-manager" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.441131 4799 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="182a073a-d8e7-4cc7-8476-b34d8ceb12c0" containerName="pruner" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.441847 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.442209 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5"] Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.443040 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.443585 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.443788 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.444013 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.444207 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.445708 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.445984 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.446064 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.446195 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.446409 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.446473 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.446536 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.446657 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.453605 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.458274 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj"] Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.464997 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5"] Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552513 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5939febf-2185-45c7-9d20-1edd99d929f8-serving-cert\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " 
pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552553 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-client-ca\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552578 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6ks\" (UniqueName: \"kubernetes.io/projected/5939febf-2185-45c7-9d20-1edd99d929f8-kube-api-access-9x6ks\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552609 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-config\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552720 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-client-ca\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552737 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-config\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552761 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97553c56-e72d-48bc-bc0c-5b66119b7492-serving-cert\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552792 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzk4z\" (UniqueName: \"kubernetes.io/projected/97553c56-e72d-48bc-bc0c-5b66119b7492-kube-api-access-fzk4z\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.552811 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-proxy-ca-bundles\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.654233 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzk4z\" (UniqueName: \"kubernetes.io/projected/97553c56-e72d-48bc-bc0c-5b66119b7492-kube-api-access-fzk4z\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: 
\"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.654546 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-proxy-ca-bundles\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.654683 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5939febf-2185-45c7-9d20-1edd99d929f8-serving-cert\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.654793 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-client-ca\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.654896 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6ks\" (UniqueName: \"kubernetes.io/projected/5939febf-2185-45c7-9d20-1edd99d929f8-kube-api-access-9x6ks\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.655017 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-config\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.655101 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-config\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.655189 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-client-ca\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.655474 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97553c56-e72d-48bc-bc0c-5b66119b7492-serving-cert\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.655776 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-client-ca\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 
20:08:43.655800 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-proxy-ca-bundles\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.656686 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-config\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.657117 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-config\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.657280 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-client-ca\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.661038 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97553c56-e72d-48bc-bc0c-5b66119b7492-serving-cert\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " 
pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.661742 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5939febf-2185-45c7-9d20-1edd99d929f8-serving-cert\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.675201 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzk4z\" (UniqueName: \"kubernetes.io/projected/97553c56-e72d-48bc-bc0c-5b66119b7492-kube-api-access-fzk4z\") pod \"route-controller-manager-5bc8644d5b-p8qr5\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.686997 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6ks\" (UniqueName: \"kubernetes.io/projected/5939febf-2185-45c7-9d20-1edd99d929f8-kube-api-access-9x6ks\") pod \"controller-manager-7df54f6c4d-jh5rj\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.717837 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.719731 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177"} Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.720692 4799 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.741876 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=74.741858143 podStartE2EDuration="1m14.741858143s" podCreationTimestamp="2026-03-19 20:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:43.739503127 +0000 UTC m=+201.345456289" watchObservedRunningTime="2026-03-19 20:08:43.741858143 +0000 UTC m=+201.347811215" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.770310 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:43 crc kubenswrapper[4799]: I0319 20:08:43.777425 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.185437 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5"] Mar 19 20:08:44 crc kubenswrapper[4799]: W0319 20:08:44.190836 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97553c56_e72d_48bc_bc0c_5b66119b7492.slice/crio-77ee01957011d3963684a3ebefeff61560461413f817016960715cf04debc43e WatchSource:0}: Error finding container 77ee01957011d3963684a3ebefeff61560461413f817016960715cf04debc43e: Status 404 returned error can't find the container with id 77ee01957011d3963684a3ebefeff61560461413f817016960715cf04debc43e Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.245703 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj"] Mar 19 20:08:44 crc kubenswrapper[4799]: W0319 20:08:44.261003 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5939febf_2185_45c7_9d20_1edd99d929f8.slice/crio-317694b58caa2d8d724d20a3405d3c6954e1f1126e24e25dae1252e3c666e4b4 WatchSource:0}: Error finding container 317694b58caa2d8d724d20a3405d3c6954e1f1126e24e25dae1252e3c666e4b4: Status 404 returned error can't find the container with id 317694b58caa2d8d724d20a3405d3c6954e1f1126e24e25dae1252e3c666e4b4 Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.726268 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" event={"ID":"5939febf-2185-45c7-9d20-1edd99d929f8","Type":"ContainerStarted","Data":"0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16"} Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.726314 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" event={"ID":"5939febf-2185-45c7-9d20-1edd99d929f8","Type":"ContainerStarted","Data":"317694b58caa2d8d724d20a3405d3c6954e1f1126e24e25dae1252e3c666e4b4"} Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.726574 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.727933 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" event={"ID":"97553c56-e72d-48bc-bc0c-5b66119b7492","Type":"ContainerStarted","Data":"37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e"} Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.727963 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" event={"ID":"97553c56-e72d-48bc-bc0c-5b66119b7492","Type":"ContainerStarted","Data":"77ee01957011d3963684a3ebefeff61560461413f817016960715cf04debc43e"} Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.728176 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.742324 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.747672 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.788837 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" podStartSLOduration=5.788818557 podStartE2EDuration="5.788818557s" podCreationTimestamp="2026-03-19 20:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:44.763376018 +0000 UTC m=+202.369329090" watchObservedRunningTime="2026-03-19 20:08:44.788818557 +0000 UTC m=+202.394771629" Mar 19 20:08:44 crc kubenswrapper[4799]: I0319 20:08:44.822309 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" podStartSLOduration=5.822291049 podStartE2EDuration="5.822291049s" podCreationTimestamp="2026-03-19 20:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:08:44.821980481 +0000 UTC m=+202.427933553" watchObservedRunningTime="2026-03-19 20:08:44.822291049 +0000 UTC m=+202.428244121" Mar 19 20:08:47 crc kubenswrapper[4799]: I0319 20:08:47.753444 4799 generic.go:334] "Generic (PLEG): container finished" podID="af016f47-a261-463b-98e4-710fdb4557bb" containerID="e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba" exitCode=0 Mar 19 20:08:47 crc kubenswrapper[4799]: I0319 20:08:47.753527 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2587j" event={"ID":"af016f47-a261-463b-98e4-710fdb4557bb","Type":"ContainerDied","Data":"e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba"} Mar 19 20:08:47 crc kubenswrapper[4799]: I0319 20:08:47.757158 4799 generic.go:334] "Generic (PLEG): container finished" podID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerID="1133972b262689629f79ba55be1b1b72d5622255a70981f85f69392e63511a95" exitCode=0 Mar 19 20:08:47 crc kubenswrapper[4799]: I0319 20:08:47.757248 4799 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-2gp4f" event={"ID":"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40","Type":"ContainerDied","Data":"1133972b262689629f79ba55be1b1b72d5622255a70981f85f69392e63511a95"} Mar 19 20:08:48 crc kubenswrapper[4799]: I0319 20:08:48.767450 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2587j" event={"ID":"af016f47-a261-463b-98e4-710fdb4557bb","Type":"ContainerStarted","Data":"7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451"} Mar 19 20:08:48 crc kubenswrapper[4799]: I0319 20:08:48.770500 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gp4f" event={"ID":"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40","Type":"ContainerStarted","Data":"cb60445b1d4c3eae208e85e127a8f2f163fb086903f8eea73854648bcebd72fb"} Mar 19 20:08:48 crc kubenswrapper[4799]: I0319 20:08:48.786205 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2587j" podStartSLOduration=3.505640135 podStartE2EDuration="48.786187206s" podCreationTimestamp="2026-03-19 20:08:00 +0000 UTC" firstStartedPulling="2026-03-19 20:08:03.02962681 +0000 UTC m=+160.635579882" lastFinishedPulling="2026-03-19 20:08:48.310173881 +0000 UTC m=+205.916126953" observedRunningTime="2026-03-19 20:08:48.784803097 +0000 UTC m=+206.390756169" watchObservedRunningTime="2026-03-19 20:08:48.786187206 +0000 UTC m=+206.392140278" Mar 19 20:08:49 crc kubenswrapper[4799]: I0319 20:08:49.145030 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2gp4f" podStartSLOduration=3.6581951249999998 podStartE2EDuration="51.145010855s" podCreationTimestamp="2026-03-19 20:07:58 +0000 UTC" firstStartedPulling="2026-03-19 20:08:00.711202026 +0000 UTC m=+158.317155098" lastFinishedPulling="2026-03-19 20:08:48.198017756 +0000 UTC m=+205.803970828" 
observedRunningTime="2026-03-19 20:08:48.802165971 +0000 UTC m=+206.408119063" watchObservedRunningTime="2026-03-19 20:08:49.145010855 +0000 UTC m=+206.750963927" Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.475746 4799 csr.go:261] certificate signing request csr-xhsfn is approved, waiting to be issued Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.483824 4799 csr.go:257] certificate signing request csr-xhsfn is issued Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.485565 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.522215 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.788762 4799 generic.go:334] "Generic (PLEG): container finished" podID="31aa7077-55c5-426b-a92f-c93b8d767105" containerID="f2c69ecf363bed623988c3cdc0da8c035cc1467783192da8b53610c7e2ffdc3f" exitCode=0 Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.788832 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" event={"ID":"31aa7077-55c5-426b-a92f-c93b8d767105","Type":"ContainerDied","Data":"f2c69ecf363bed623988c3cdc0da8c035cc1467783192da8b53610c7e2ffdc3f"} Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.790361 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2dks" event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerStarted","Data":"9bca2bfa6141ae9428d30cb75a60bad31ae20ec0c1e6cdd7cdf82494c479c3cf"} Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.791707 4799 generic.go:334] "Generic (PLEG): container finished" podID="9365ab34-43df-42ee-bc04-baeba5579717" containerID="2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb" exitCode=0 Mar 19 20:08:50 crc 
kubenswrapper[4799]: I0319 20:08:50.791743 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq5wd" event={"ID":"9365ab34-43df-42ee-bc04-baeba5579717","Type":"ContainerDied","Data":"2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb"} Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.900348 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.901036 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:50 crc kubenswrapper[4799]: I0319 20:08:50.973985 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:08:51 crc kubenswrapper[4799]: I0319 20:08:51.485660 4799 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-19 11:46:18.677153705 +0000 UTC Mar 19 20:08:51 crc kubenswrapper[4799]: I0319 20:08:51.485692 4799 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5871h37m27.191463866s for next certificate rotation Mar 19 20:08:51 crc kubenswrapper[4799]: I0319 20:08:51.800986 4799 generic.go:334] "Generic (PLEG): container finished" podID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerID="9bca2bfa6141ae9428d30cb75a60bad31ae20ec0c1e6cdd7cdf82494c479c3cf" exitCode=0 Mar 19 20:08:51 crc kubenswrapper[4799]: I0319 20:08:51.801086 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2dks" event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerDied","Data":"9bca2bfa6141ae9428d30cb75a60bad31ae20ec0c1e6cdd7cdf82494c479c3cf"} Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.471945 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.486260 4799 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-07 07:56:30.81553439 +0000 UTC Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.486324 4799 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7043h47m38.329213317s for next certificate rotation Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.564552 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw98s\" (UniqueName: \"kubernetes.io/projected/31aa7077-55c5-426b-a92f-c93b8d767105-kube-api-access-fw98s\") pod \"31aa7077-55c5-426b-a92f-c93b8d767105\" (UID: \"31aa7077-55c5-426b-a92f-c93b8d767105\") " Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.572702 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31aa7077-55c5-426b-a92f-c93b8d767105-kube-api-access-fw98s" (OuterVolumeSpecName: "kube-api-access-fw98s") pod "31aa7077-55c5-426b-a92f-c93b8d767105" (UID: "31aa7077-55c5-426b-a92f-c93b8d767105"). InnerVolumeSpecName "kube-api-access-fw98s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.666673 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw98s\" (UniqueName: \"kubernetes.io/projected/31aa7077-55c5-426b-a92f-c93b8d767105-kube-api-access-fw98s\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.807104 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" event={"ID":"31aa7077-55c5-426b-a92f-c93b8d767105","Type":"ContainerDied","Data":"a5c2b77e039b2bf67c04609cec251c960d92570f2e04adfd7e3c76452fd9f1e2"} Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.807152 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-5gcs8" Mar 19 20:08:52 crc kubenswrapper[4799]: I0319 20:08:52.807161 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c2b77e039b2bf67c04609cec251c960d92570f2e04adfd7e3c76452fd9f1e2" Mar 19 20:08:53 crc kubenswrapper[4799]: I0319 20:08:53.651074 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.821827 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2dks" event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerStarted","Data":"47874d181aea8606c62bcfbb60557de06f8956f419be258bdc88d203b19e29cc"} Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.823883 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq5wd" event={"ID":"9365ab34-43df-42ee-bc04-baeba5579717","Type":"ContainerStarted","Data":"dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740"} Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.825471 4799 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerStarted","Data":"62e948aad1826163f2570189998fb61749dd6477f308ce35ec2c90edbf07fddd"} Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.827987 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerStarted","Data":"288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459"} Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.829725 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerStarted","Data":"8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0"} Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.844208 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k2dks" podStartSLOduration=2.433910834 podStartE2EDuration="53.844193576s" podCreationTimestamp="2026-03-19 20:08:01 +0000 UTC" firstStartedPulling="2026-03-19 20:08:03.002075291 +0000 UTC m=+160.608028363" lastFinishedPulling="2026-03-19 20:08:54.412358033 +0000 UTC m=+212.018311105" observedRunningTime="2026-03-19 20:08:54.842143539 +0000 UTC m=+212.448096611" watchObservedRunningTime="2026-03-19 20:08:54.844193576 +0000 UTC m=+212.450146648" Mar 19 20:08:54 crc kubenswrapper[4799]: I0319 20:08:54.867464 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sq5wd" podStartSLOduration=5.545980216 podStartE2EDuration="56.867446354s" podCreationTimestamp="2026-03-19 20:07:58 +0000 UTC" firstStartedPulling="2026-03-19 20:08:01.876933108 +0000 UTC m=+159.482886180" lastFinishedPulling="2026-03-19 20:08:53.198399236 +0000 UTC m=+210.804352318" 
observedRunningTime="2026-03-19 20:08:54.863940647 +0000 UTC m=+212.469893709" watchObservedRunningTime="2026-03-19 20:08:54.867446354 +0000 UTC m=+212.473399426" Mar 19 20:08:55 crc kubenswrapper[4799]: I0319 20:08:55.836592 4799 generic.go:334] "Generic (PLEG): container finished" podID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerID="288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459" exitCode=0 Mar 19 20:08:55 crc kubenswrapper[4799]: I0319 20:08:55.836665 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerDied","Data":"288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459"} Mar 19 20:08:55 crc kubenswrapper[4799]: I0319 20:08:55.849576 4799 generic.go:334] "Generic (PLEG): container finished" podID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerID="62e948aad1826163f2570189998fb61749dd6477f308ce35ec2c90edbf07fddd" exitCode=0 Mar 19 20:08:55 crc kubenswrapper[4799]: I0319 20:08:55.849649 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerDied","Data":"62e948aad1826163f2570189998fb61749dd6477f308ce35ec2c90edbf07fddd"} Mar 19 20:08:55 crc kubenswrapper[4799]: I0319 20:08:55.859794 4799 generic.go:334] "Generic (PLEG): container finished" podID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerID="8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0" exitCode=0 Mar 19 20:08:55 crc kubenswrapper[4799]: I0319 20:08:55.859964 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerDied","Data":"8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0"} Mar 19 20:08:56 crc kubenswrapper[4799]: I0319 20:08:56.867310 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerStarted","Data":"e57a2e803b83ab7674ce28943dc2ded590399d89fe3a57d3183069d7ac4756fe"} Mar 19 20:08:56 crc kubenswrapper[4799]: I0319 20:08:56.870982 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerStarted","Data":"717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4"} Mar 19 20:08:56 crc kubenswrapper[4799]: I0319 20:08:56.873847 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerStarted","Data":"860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3"} Mar 19 20:08:56 crc kubenswrapper[4799]: I0319 20:08:56.888787 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lf2lf" podStartSLOduration=5.463925025 podStartE2EDuration="59.88876566s" podCreationTimestamp="2026-03-19 20:07:57 +0000 UTC" firstStartedPulling="2026-03-19 20:08:01.970444364 +0000 UTC m=+159.576397436" lastFinishedPulling="2026-03-19 20:08:56.395284989 +0000 UTC m=+214.001238071" observedRunningTime="2026-03-19 20:08:56.888609856 +0000 UTC m=+214.494562958" watchObservedRunningTime="2026-03-19 20:08:56.88876566 +0000 UTC m=+214.494718742" Mar 19 20:08:56 crc kubenswrapper[4799]: I0319 20:08:56.912603 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fgtxw" podStartSLOduration=3.304318162 podStartE2EDuration="58.912562383s" podCreationTimestamp="2026-03-19 20:07:58 +0000 UTC" firstStartedPulling="2026-03-19 20:08:00.754040558 +0000 UTC m=+158.359993630" lastFinishedPulling="2026-03-19 20:08:56.362284749 +0000 UTC m=+213.968237851" observedRunningTime="2026-03-19 
20:08:56.909303763 +0000 UTC m=+214.515256845" watchObservedRunningTime="2026-03-19 20:08:56.912562383 +0000 UTC m=+214.518515465" Mar 19 20:08:56 crc kubenswrapper[4799]: I0319 20:08:56.931377 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q5qgk" podStartSLOduration=3.900213494 podStartE2EDuration="55.931360457s" podCreationTimestamp="2026-03-19 20:08:01 +0000 UTC" firstStartedPulling="2026-03-19 20:08:04.277897313 +0000 UTC m=+161.883850385" lastFinishedPulling="2026-03-19 20:08:56.309044256 +0000 UTC m=+213.914997348" observedRunningTime="2026-03-19 20:08:56.928368254 +0000 UTC m=+214.534321346" watchObservedRunningTime="2026-03-19 20:08:56.931360457 +0000 UTC m=+214.537313529" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.536399 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.536716 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.580121 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.607239 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.607285 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.650842 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.710098 4799 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.710317 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.752875 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.927625 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.966630 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:08:58 crc kubenswrapper[4799]: I0319 20:08:58.966691 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.009072 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.158106 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj"] Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.158332 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" podUID="5939febf-2185-45c7-9d20-1edd99d929f8" containerName="controller-manager" containerID="cri-o://0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16" gracePeriod=30 Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.163578 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5"] Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.163787 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" podUID="97553c56-e72d-48bc-bc0c-5b66119b7492" containerName="route-controller-manager" containerID="cri-o://37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e" gracePeriod=30 Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.748940 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.802177 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866252 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-config\") pod \"97553c56-e72d-48bc-bc0c-5b66119b7492\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866324 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x6ks\" (UniqueName: \"kubernetes.io/projected/5939febf-2185-45c7-9d20-1edd99d929f8-kube-api-access-9x6ks\") pod \"5939febf-2185-45c7-9d20-1edd99d929f8\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866416 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-config\") pod \"5939febf-2185-45c7-9d20-1edd99d929f8\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " Mar 19 20:08:59 
crc kubenswrapper[4799]: I0319 20:08:59.866456 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzk4z\" (UniqueName: \"kubernetes.io/projected/97553c56-e72d-48bc-bc0c-5b66119b7492-kube-api-access-fzk4z\") pod \"97553c56-e72d-48bc-bc0c-5b66119b7492\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866496 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97553c56-e72d-48bc-bc0c-5b66119b7492-serving-cert\") pod \"97553c56-e72d-48bc-bc0c-5b66119b7492\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866534 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-client-ca\") pod \"5939febf-2185-45c7-9d20-1edd99d929f8\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866567 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-proxy-ca-bundles\") pod \"5939febf-2185-45c7-9d20-1edd99d929f8\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866613 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5939febf-2185-45c7-9d20-1edd99d929f8-serving-cert\") pod \"5939febf-2185-45c7-9d20-1edd99d929f8\" (UID: \"5939febf-2185-45c7-9d20-1edd99d929f8\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.866676 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-client-ca\") 
pod \"97553c56-e72d-48bc-bc0c-5b66119b7492\" (UID: \"97553c56-e72d-48bc-bc0c-5b66119b7492\") " Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.867267 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "5939febf-2185-45c7-9d20-1edd99d929f8" (UID: "5939febf-2185-45c7-9d20-1edd99d929f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.867285 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-config" (OuterVolumeSpecName: "config") pod "97553c56-e72d-48bc-bc0c-5b66119b7492" (UID: "97553c56-e72d-48bc-bc0c-5b66119b7492"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.867576 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5939febf-2185-45c7-9d20-1edd99d929f8" (UID: "5939febf-2185-45c7-9d20-1edd99d929f8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.867668 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-client-ca" (OuterVolumeSpecName: "client-ca") pod "97553c56-e72d-48bc-bc0c-5b66119b7492" (UID: "97553c56-e72d-48bc-bc0c-5b66119b7492"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.870830 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-config" (OuterVolumeSpecName: "config") pod "5939febf-2185-45c7-9d20-1edd99d929f8" (UID: "5939febf-2185-45c7-9d20-1edd99d929f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.875572 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5939febf-2185-45c7-9d20-1edd99d929f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5939febf-2185-45c7-9d20-1edd99d929f8" (UID: "5939febf-2185-45c7-9d20-1edd99d929f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.875591 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97553c56-e72d-48bc-bc0c-5b66119b7492-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97553c56-e72d-48bc-bc0c-5b66119b7492" (UID: "97553c56-e72d-48bc-bc0c-5b66119b7492"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.875613 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5939febf-2185-45c7-9d20-1edd99d929f8-kube-api-access-9x6ks" (OuterVolumeSpecName: "kube-api-access-9x6ks") pod "5939febf-2185-45c7-9d20-1edd99d929f8" (UID: "5939febf-2185-45c7-9d20-1edd99d929f8"). InnerVolumeSpecName "kube-api-access-9x6ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.875649 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97553c56-e72d-48bc-bc0c-5b66119b7492-kube-api-access-fzk4z" (OuterVolumeSpecName: "kube-api-access-fzk4z") pod "97553c56-e72d-48bc-bc0c-5b66119b7492" (UID: "97553c56-e72d-48bc-bc0c-5b66119b7492"). InnerVolumeSpecName "kube-api-access-fzk4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.890896 4799 generic.go:334] "Generic (PLEG): container finished" podID="5939febf-2185-45c7-9d20-1edd99d929f8" containerID="0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16" exitCode=0 Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.890937 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" event={"ID":"5939febf-2185-45c7-9d20-1edd99d929f8","Type":"ContainerDied","Data":"0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16"} Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.890951 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.891029 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj" event={"ID":"5939febf-2185-45c7-9d20-1edd99d929f8","Type":"ContainerDied","Data":"317694b58caa2d8d724d20a3405d3c6954e1f1126e24e25dae1252e3c666e4b4"} Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.891051 4799 scope.go:117] "RemoveContainer" containerID="0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.893906 4799 generic.go:334] "Generic (PLEG): container finished" podID="97553c56-e72d-48bc-bc0c-5b66119b7492" containerID="37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e" exitCode=0 Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.894054 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" event={"ID":"97553c56-e72d-48bc-bc0c-5b66119b7492","Type":"ContainerDied","Data":"37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e"} Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.894102 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" event={"ID":"97553c56-e72d-48bc-bc0c-5b66119b7492","Type":"ContainerDied","Data":"77ee01957011d3963684a3ebefeff61560461413f817016960715cf04debc43e"} Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.894157 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.907334 4799 scope.go:117] "RemoveContainer" containerID="0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16" Mar 19 20:08:59 crc kubenswrapper[4799]: E0319 20:08:59.907938 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16\": container with ID starting with 0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16 not found: ID does not exist" containerID="0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.907976 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16"} err="failed to get container status \"0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16\": rpc error: code = NotFound desc = could not find container \"0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16\": container with ID starting with 0a15f7a28bf127b59b298757853938906cb190cafa4b34bbded4378187a0fc16 not found: ID does not exist" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.907997 4799 scope.go:117] "RemoveContainer" containerID="37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.929782 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj"] Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.932712 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7df54f6c4d-jh5rj"] Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.937890 4799 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.937956 4799 scope.go:117] "RemoveContainer" containerID="37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e" Mar 19 20:08:59 crc kubenswrapper[4799]: E0319 20:08:59.938773 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e\": container with ID starting with 37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e not found: ID does not exist" containerID="37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.938807 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e"} err="failed to get container status \"37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e\": rpc error: code = NotFound desc = could not find container \"37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e\": container with ID starting with 37baf75297b127a0fb02921efc3883972168bb0e672380f94437fafc8668197e not found: ID does not exist" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.943373 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5"] Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.946126 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bc8644d5b-p8qr5"] Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.968439 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-config\") on node \"crc\" DevicePath \"\"" Mar 19 
20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969548 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzk4z\" (UniqueName: \"kubernetes.io/projected/97553c56-e72d-48bc-bc0c-5b66119b7492-kube-api-access-fzk4z\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969588 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97553c56-e72d-48bc-bc0c-5b66119b7492-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969607 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969636 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5939febf-2185-45c7-9d20-1edd99d929f8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969650 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5939febf-2185-45c7-9d20-1edd99d929f8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969663 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969674 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97553c56-e72d-48bc-bc0c-5b66119b7492-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:59 crc kubenswrapper[4799]: I0319 20:08:59.969688 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x6ks\" (UniqueName: 
\"kubernetes.io/projected/5939febf-2185-45c7-9d20-1edd99d929f8-kube-api-access-9x6ks\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.450818 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6855947fbf-q8zxk"] Mar 19 20:09:00 crc kubenswrapper[4799]: E0319 20:09:00.451414 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5939febf-2185-45c7-9d20-1edd99d929f8" containerName="controller-manager" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.451435 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5939febf-2185-45c7-9d20-1edd99d929f8" containerName="controller-manager" Mar 19 20:09:00 crc kubenswrapper[4799]: E0319 20:09:00.451447 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31aa7077-55c5-426b-a92f-c93b8d767105" containerName="oc" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.451456 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="31aa7077-55c5-426b-a92f-c93b8d767105" containerName="oc" Mar 19 20:09:00 crc kubenswrapper[4799]: E0319 20:09:00.451484 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97553c56-e72d-48bc-bc0c-5b66119b7492" containerName="route-controller-manager" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.451492 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="97553c56-e72d-48bc-bc0c-5b66119b7492" containerName="route-controller-manager" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.451644 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="31aa7077-55c5-426b-a92f-c93b8d767105" containerName="oc" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.451664 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="5939febf-2185-45c7-9d20-1edd99d929f8" containerName="controller-manager" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.451680 4799 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="97553c56-e72d-48bc-bc0c-5b66119b7492" containerName="route-controller-manager" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.452146 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.457602 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.458109 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.459020 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.459195 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.464162 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx"] Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.465375 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.465574 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.465629 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.470590 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.470912 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.472259 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.473304 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.473310 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.473483 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.475763 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6855947fbf-q8zxk"] Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.493272 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx"] Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.510970 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.584520 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b456a663-a0de-426b-9f78-69f04d576342-serving-cert\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.584571 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-client-ca\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.584648 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-config\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.584738 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d65d512-1dba-4680-81ba-8e67489154ab-serving-cert\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.584891 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6559\" (UniqueName: \"kubernetes.io/projected/b456a663-a0de-426b-9f78-69f04d576342-kube-api-access-j6559\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " 
pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.585004 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-client-ca\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.585077 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-proxy-ca-bundles\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.585170 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4x8d\" (UniqueName: \"kubernetes.io/projected/7d65d512-1dba-4680-81ba-8e67489154ab-kube-api-access-r4x8d\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.585291 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-config\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686548 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-client-ca\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686604 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-proxy-ca-bundles\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686637 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4x8d\" (UniqueName: \"kubernetes.io/projected/7d65d512-1dba-4680-81ba-8e67489154ab-kube-api-access-r4x8d\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686675 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-config\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686710 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b456a663-a0de-426b-9f78-69f04d576342-serving-cert\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 
19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686726 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-client-ca\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686740 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-config\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686755 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d65d512-1dba-4680-81ba-8e67489154ab-serving-cert\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.686774 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6559\" (UniqueName: \"kubernetes.io/projected/b456a663-a0de-426b-9f78-69f04d576342-kube-api-access-j6559\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.688211 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-client-ca\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: 
\"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.688842 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-proxy-ca-bundles\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.688897 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-config\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.689345 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-client-ca\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.689849 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-config\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.702775 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7d65d512-1dba-4680-81ba-8e67489154ab-serving-cert\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.704727 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b456a663-a0de-426b-9f78-69f04d576342-serving-cert\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.705601 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6559\" (UniqueName: \"kubernetes.io/projected/b456a663-a0de-426b-9f78-69f04d576342-kube-api-access-j6559\") pod \"controller-manager-6855947fbf-q8zxk\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.723747 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4x8d\" (UniqueName: \"kubernetes.io/projected/7d65d512-1dba-4680-81ba-8e67489154ab-kube-api-access-r4x8d\") pod \"route-controller-manager-5c56949c99-l4vtx\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.770749 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.798824 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:00 crc kubenswrapper[4799]: I0319 20:09:00.964926 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.139268 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5939febf-2185-45c7-9d20-1edd99d929f8" path="/var/lib/kubelet/pods/5939febf-2185-45c7-9d20-1edd99d929f8/volumes" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.140755 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97553c56-e72d-48bc-bc0c-5b66119b7492" path="/var/lib/kubelet/pods/97553c56-e72d-48bc-bc0c-5b66119b7492/volumes" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.214821 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6855947fbf-q8zxk"] Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.288503 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx"] Mar 19 20:09:01 crc kubenswrapper[4799]: W0319 20:09:01.299105 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d65d512_1dba_4680_81ba_8e67489154ab.slice/crio-292192e08e37bc4c4137fdd691580bc715a049cb739696df18b53505ebd50a92 WatchSource:0}: Error finding container 292192e08e37bc4c4137fdd691580bc715a049cb739696df18b53505ebd50a92: Status 404 returned error can't find the container with id 292192e08e37bc4c4137fdd691580bc715a049cb739696df18b53505ebd50a92 Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.547864 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.548640 4799 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.885749 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.885809 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.923734 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" event={"ID":"7d65d512-1dba-4680-81ba-8e67489154ab","Type":"ContainerStarted","Data":"0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494"} Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.923785 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" event={"ID":"7d65d512-1dba-4680-81ba-8e67489154ab","Type":"ContainerStarted","Data":"292192e08e37bc4c4137fdd691580bc715a049cb739696df18b53505ebd50a92"} Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.927804 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" event={"ID":"b456a663-a0de-426b-9f78-69f04d576342","Type":"ContainerStarted","Data":"7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998"} Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.927836 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" event={"ID":"b456a663-a0de-426b-9f78-69f04d576342","Type":"ContainerStarted","Data":"f91e744c3cf054116f5a93958bbe4844afd618b549373d2f3fdc8f11a2dce362"} Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.928266 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.929719 4799 patch_prober.go:28] interesting pod/controller-manager-6855947fbf-q8zxk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.929774 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" podUID="b456a663-a0de-426b-9f78-69f04d576342" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 19 20:09:01 crc kubenswrapper[4799]: I0319 20:09:01.960860 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" podStartSLOduration=2.960825696 podStartE2EDuration="2.960825696s" podCreationTimestamp="2026-03-19 20:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:09:01.951438515 +0000 UTC m=+219.557391587" watchObservedRunningTime="2026-03-19 20:09:01.960825696 +0000 UTC m=+219.566778808" Mar 19 20:09:02 crc kubenswrapper[4799]: I0319 20:09:02.594509 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2dks" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="registry-server" probeResult="failure" output=< Mar 19 20:09:02 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:09:02 crc kubenswrapper[4799]: > Mar 19 20:09:02 crc kubenswrapper[4799]: I0319 20:09:02.940530 4799 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-q5qgk" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="registry-server" probeResult="failure" output=< Mar 19 20:09:02 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:09:02 crc kubenswrapper[4799]: > Mar 19 20:09:02 crc kubenswrapper[4799]: I0319 20:09:02.943422 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:02 crc kubenswrapper[4799]: I0319 20:09:02.961118 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq5wd"] Mar 19 20:09:02 crc kubenswrapper[4799]: I0319 20:09:02.961640 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sq5wd" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="registry-server" containerID="cri-o://dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740" gracePeriod=2 Mar 19 20:09:02 crc kubenswrapper[4799]: I0319 20:09:02.976438 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" podStartSLOduration=3.976411397 podStartE2EDuration="3.976411397s" podCreationTimestamp="2026-03-19 20:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:09:02.965973266 +0000 UTC m=+220.571926378" watchObservedRunningTime="2026-03-19 20:09:02.976411397 +0000 UTC m=+220.582364509" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.376128 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.422175 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-utilities\") pod \"9365ab34-43df-42ee-bc04-baeba5579717\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.422316 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-catalog-content\") pod \"9365ab34-43df-42ee-bc04-baeba5579717\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.422412 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdxbt\" (UniqueName: \"kubernetes.io/projected/9365ab34-43df-42ee-bc04-baeba5579717-kube-api-access-bdxbt\") pod \"9365ab34-43df-42ee-bc04-baeba5579717\" (UID: \"9365ab34-43df-42ee-bc04-baeba5579717\") " Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.423324 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-utilities" (OuterVolumeSpecName: "utilities") pod "9365ab34-43df-42ee-bc04-baeba5579717" (UID: "9365ab34-43df-42ee-bc04-baeba5579717"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.423632 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.439655 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9365ab34-43df-42ee-bc04-baeba5579717-kube-api-access-bdxbt" (OuterVolumeSpecName: "kube-api-access-bdxbt") pod "9365ab34-43df-42ee-bc04-baeba5579717" (UID: "9365ab34-43df-42ee-bc04-baeba5579717"). InnerVolumeSpecName "kube-api-access-bdxbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.478230 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9365ab34-43df-42ee-bc04-baeba5579717" (UID: "9365ab34-43df-42ee-bc04-baeba5579717"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.524536 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdxbt\" (UniqueName: \"kubernetes.io/projected/9365ab34-43df-42ee-bc04-baeba5579717-kube-api-access-bdxbt\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.524568 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9365ab34-43df-42ee-bc04-baeba5579717-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.945555 4799 generic.go:334] "Generic (PLEG): container finished" podID="9365ab34-43df-42ee-bc04-baeba5579717" containerID="dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740" exitCode=0 Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.945650 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq5wd" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.945713 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq5wd" event={"ID":"9365ab34-43df-42ee-bc04-baeba5579717","Type":"ContainerDied","Data":"dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740"} Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.945778 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq5wd" event={"ID":"9365ab34-43df-42ee-bc04-baeba5579717","Type":"ContainerDied","Data":"25516d56582951a4797b5957b0746f1e1d95111f53fa8d8231b25a848973291a"} Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.945809 4799 scope.go:117] "RemoveContainer" containerID="dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.958676 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2587j"] Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.959380 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2587j" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="registry-server" containerID="cri-o://7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451" gracePeriod=2 Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.977869 4799 scope.go:117] "RemoveContainer" containerID="2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb" Mar 19 20:09:03 crc kubenswrapper[4799]: I0319 20:09:03.995477 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq5wd"] Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.010605 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sq5wd"] Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.025837 4799 scope.go:117] "RemoveContainer" containerID="7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.137865 4799 scope.go:117] "RemoveContainer" containerID="dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740" Mar 19 20:09:04 crc kubenswrapper[4799]: E0319 20:09:04.139750 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740\": container with ID starting with dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740 not found: ID does not exist" containerID="dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.139798 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740"} err="failed to get container status \"dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740\": rpc error: code = NotFound desc = could not find container \"dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740\": container with ID starting with dac6b5c1544069ee7d08362a8ec3e2df48ccc77c49a0719239289df2bf63b740 not found: ID does not exist" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.139834 4799 scope.go:117] "RemoveContainer" containerID="2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb" Mar 19 20:09:04 crc kubenswrapper[4799]: E0319 20:09:04.140249 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb\": container with ID starting with 2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb not found: ID does not exist" containerID="2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.140274 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb"} err="failed to get container status \"2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb\": rpc error: code = NotFound desc = could not find container \"2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb\": container with ID starting with 2dc2d3ee9899c522846210043b77f20096c1542fb73ab6a200b73c15bf7978cb not found: ID does not exist" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.140290 4799 scope.go:117] "RemoveContainer" containerID="7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4" Mar 19 20:09:04 crc kubenswrapper[4799]: E0319 20:09:04.140867 4799 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4\": container with ID starting with 7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4 not found: ID does not exist" containerID="7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.140924 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4"} err="failed to get container status \"7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4\": rpc error: code = NotFound desc = could not find container \"7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4\": container with ID starting with 7507906880c5c3617d588cfbad8548399d08e0191996c7de1a5ef6dbf1b1d8a4 not found: ID does not exist" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.468137 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.539533 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-catalog-content\") pod \"af016f47-a261-463b-98e4-710fdb4557bb\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.539666 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97ks6\" (UniqueName: \"kubernetes.io/projected/af016f47-a261-463b-98e4-710fdb4557bb-kube-api-access-97ks6\") pod \"af016f47-a261-463b-98e4-710fdb4557bb\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.539810 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-utilities\") pod \"af016f47-a261-463b-98e4-710fdb4557bb\" (UID: \"af016f47-a261-463b-98e4-710fdb4557bb\") " Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.540586 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-utilities" (OuterVolumeSpecName: "utilities") pod "af016f47-a261-463b-98e4-710fdb4557bb" (UID: "af016f47-a261-463b-98e4-710fdb4557bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.548492 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af016f47-a261-463b-98e4-710fdb4557bb-kube-api-access-97ks6" (OuterVolumeSpecName: "kube-api-access-97ks6") pod "af016f47-a261-463b-98e4-710fdb4557bb" (UID: "af016f47-a261-463b-98e4-710fdb4557bb"). InnerVolumeSpecName "kube-api-access-97ks6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.575549 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af016f47-a261-463b-98e4-710fdb4557bb" (UID: "af016f47-a261-463b-98e4-710fdb4557bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.642340 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.642405 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97ks6\" (UniqueName: \"kubernetes.io/projected/af016f47-a261-463b-98e4-710fdb4557bb-kube-api-access-97ks6\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.642423 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af016f47-a261-463b-98e4-710fdb4557bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.958149 4799 generic.go:334] "Generic (PLEG): container finished" podID="af016f47-a261-463b-98e4-710fdb4557bb" containerID="7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451" exitCode=0 Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.958219 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2587j" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.958325 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2587j" event={"ID":"af016f47-a261-463b-98e4-710fdb4557bb","Type":"ContainerDied","Data":"7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451"} Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.958419 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2587j" event={"ID":"af016f47-a261-463b-98e4-710fdb4557bb","Type":"ContainerDied","Data":"b5186b2ecd3795166c9ed949a997550c688cc125e71aa8ce53be0a41fd01837f"} Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.958465 4799 scope.go:117] "RemoveContainer" containerID="7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451" Mar 19 20:09:04 crc kubenswrapper[4799]: I0319 20:09:04.983762 4799 scope.go:117] "RemoveContainer" containerID="e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.014925 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2587j"] Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.020594 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2587j"] Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.022593 4799 scope.go:117] "RemoveContainer" containerID="2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.041412 4799 scope.go:117] "RemoveContainer" containerID="7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451" Mar 19 20:09:05 crc kubenswrapper[4799]: E0319 20:09:05.041878 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451\": container with ID starting with 7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451 not found: ID does not exist" containerID="7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.041985 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451"} err="failed to get container status \"7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451\": rpc error: code = NotFound desc = could not find container \"7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451\": container with ID starting with 7f2ab2abda77d39d3aff89c3e1a8653b4d01927cde03ac82e6169157ed227451 not found: ID does not exist" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.042029 4799 scope.go:117] "RemoveContainer" containerID="e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba" Mar 19 20:09:05 crc kubenswrapper[4799]: E0319 20:09:05.042730 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba\": container with ID starting with e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba not found: ID does not exist" containerID="e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.042766 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba"} err="failed to get container status \"e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba\": rpc error: code = NotFound desc = could not find container \"e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba\": container with ID 
starting with e614dd1a10cf8d7c68b32d15409da71755b13545bdb24c448e9ef6af196f8dba not found: ID does not exist" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.042790 4799 scope.go:117] "RemoveContainer" containerID="2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8" Mar 19 20:09:05 crc kubenswrapper[4799]: E0319 20:09:05.043163 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8\": container with ID starting with 2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8 not found: ID does not exist" containerID="2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.043207 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8"} err="failed to get container status \"2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8\": rpc error: code = NotFound desc = could not find container \"2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8\": container with ID starting with 2af57991575689831af6deb399831ca25313bbb803bca7967d2f38972bf250c8 not found: ID does not exist" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.128225 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9365ab34-43df-42ee-bc04-baeba5579717" path="/var/lib/kubelet/pods/9365ab34-43df-42ee-bc04-baeba5579717/volumes" Mar 19 20:09:05 crc kubenswrapper[4799]: I0319 20:09:05.129411 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af016f47-a261-463b-98e4-710fdb4557bb" path="/var/lib/kubelet/pods/af016f47-a261-463b-98e4-710fdb4557bb/volumes" Mar 19 20:09:07 crc kubenswrapper[4799]: I0319 20:09:07.682478 4799 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" podUID="274f0b01-a045-405f-80e8-f278a87a97ce" containerName="oauth-openshift" containerID="cri-o://a0d059282d766b14e230b18f4a9dfb246a22fd8b2ecf2ae602ecaa0332ee3b55" gracePeriod=15 Mar 19 20:09:07 crc kubenswrapper[4799]: I0319 20:09:07.990208 4799 generic.go:334] "Generic (PLEG): container finished" podID="274f0b01-a045-405f-80e8-f278a87a97ce" containerID="a0d059282d766b14e230b18f4a9dfb246a22fd8b2ecf2ae602ecaa0332ee3b55" exitCode=0 Mar 19 20:09:07 crc kubenswrapper[4799]: I0319 20:09:07.990264 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" event={"ID":"274f0b01-a045-405f-80e8-f278a87a97ce","Type":"ContainerDied","Data":"a0d059282d766b14e230b18f4a9dfb246a22fd8b2ecf2ae602ecaa0332ee3b55"} Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.229537 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302514 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-idp-0-file-data\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302581 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-cliconfig\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302641 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-audit-policies\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302664 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-ocp-branding-template\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302688 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-session\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302707 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274f0b01-a045-405f-80e8-f278a87a97ce-audit-dir\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302767 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-provider-selection\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302803 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-error\") 
pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302833 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-router-certs\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302862 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcw79\" (UniqueName: \"kubernetes.io/projected/274f0b01-a045-405f-80e8-f278a87a97ce-kube-api-access-pcw79\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302886 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-serving-cert\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302919 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-login\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302941 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-service-ca\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") 
" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.302963 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-trusted-ca-bundle\") pod \"274f0b01-a045-405f-80e8-f278a87a97ce\" (UID: \"274f0b01-a045-405f-80e8-f278a87a97ce\") " Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.304361 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/274f0b01-a045-405f-80e8-f278a87a97ce-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.305220 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.305285 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.305315 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.306809 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.314905 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.315178 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274f0b01-a045-405f-80e8-f278a87a97ce-kube-api-access-pcw79" (OuterVolumeSpecName: "kube-api-access-pcw79") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "kube-api-access-pcw79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.315585 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.316130 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.317066 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.317419 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.317520 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.317756 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.318827 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "274f0b01-a045-405f-80e8-f278a87a97ce" (UID: "274f0b01-a045-405f-80e8-f278a87a97ce"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.404992 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405045 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405064 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405081 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcw79\" (UniqueName: \"kubernetes.io/projected/274f0b01-a045-405f-80e8-f278a87a97ce-kube-api-access-pcw79\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405101 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405116 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405132 4799 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405149 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405165 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405180 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405198 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405215 4799 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/274f0b01-a045-405f-80e8-f278a87a97ce-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.405230 4799 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/274f0b01-a045-405f-80e8-f278a87a97ce-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: 
I0319 20:09:08.405245 4799 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274f0b01-a045-405f-80e8-f278a87a97ce-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:08 crc kubenswrapper[4799]: I0319 20:09:08.672682 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.000782 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" event={"ID":"274f0b01-a045-405f-80e8-f278a87a97ce","Type":"ContainerDied","Data":"8f5f8fc4669ded38bd94f9eb9a681c6174c92d6494233e45944795ce11d1f905"} Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.000887 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-j5t7c" Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.001415 4799 scope.go:117] "RemoveContainer" containerID="a0d059282d766b14e230b18f4a9dfb246a22fd8b2ecf2ae602ecaa0332ee3b55" Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.043062 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.056520 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j5t7c"] Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.064208 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-j5t7c"] Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.125242 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274f0b01-a045-405f-80e8-f278a87a97ce" path="/var/lib/kubelet/pods/274f0b01-a045-405f-80e8-f278a87a97ce/volumes" Mar 19 20:09:09 crc kubenswrapper[4799]: I0319 20:09:09.758811 4799 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgtxw"] Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.009149 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fgtxw" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="registry-server" containerID="cri-o://860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3" gracePeriod=2 Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.669081 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.733827 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-utilities\") pod \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.733902 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-catalog-content\") pod \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.733929 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq7t2\" (UniqueName: \"kubernetes.io/projected/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-kube-api-access-zq7t2\") pod \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\" (UID: \"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202\") " Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.735685 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-utilities" (OuterVolumeSpecName: "utilities") pod 
"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" (UID: "4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.741626 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-kube-api-access-zq7t2" (OuterVolumeSpecName: "kube-api-access-zq7t2") pod "4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" (UID: "4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202"). InnerVolumeSpecName "kube-api-access-zq7t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.799809 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.808433 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.830180 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" (UID: "4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.835784 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.835827 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq7t2\" (UniqueName: \"kubernetes.io/projected/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-kube-api-access-zq7t2\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:10 crc kubenswrapper[4799]: I0319 20:09:10.835844 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.020921 4799 generic.go:334] "Generic (PLEG): container finished" podID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerID="860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3" exitCode=0 Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.021026 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fgtxw" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.021048 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerDied","Data":"860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3"} Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.022485 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgtxw" event={"ID":"4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202","Type":"ContainerDied","Data":"fa157baccf7919a8da3e176ae4b43a5994f28542a2387ed33ec57050a326defb"} Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.022535 4799 scope.go:117] "RemoveContainer" containerID="860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.041514 4799 scope.go:117] "RemoveContainer" containerID="288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.071719 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgtxw"] Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.081051 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fgtxw"] Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.083522 4799 scope.go:117] "RemoveContainer" containerID="fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.101973 4799 scope.go:117] "RemoveContainer" containerID="860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3" Mar 19 20:09:11 crc kubenswrapper[4799]: E0319 20:09:11.102281 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3\": container with ID starting with 860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3 not found: ID does not exist" containerID="860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.102319 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3"} err="failed to get container status \"860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3\": rpc error: code = NotFound desc = could not find container \"860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3\": container with ID starting with 860e2ed7b42413b738b23ac171918d259a580f4a969f4adb75864ee908fdbef3 not found: ID does not exist" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.102356 4799 scope.go:117] "RemoveContainer" containerID="288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459" Mar 19 20:09:11 crc kubenswrapper[4799]: E0319 20:09:11.102842 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459\": container with ID starting with 288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459 not found: ID does not exist" containerID="288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.102909 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459"} err="failed to get container status \"288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459\": rpc error: code = NotFound desc = could not find container \"288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459\": container with ID 
starting with 288e348422b8546bd8e9231cc496d9380869ea14e4947c5d8a55526283ff7459 not found: ID does not exist" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.102955 4799 scope.go:117] "RemoveContainer" containerID="fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29" Mar 19 20:09:11 crc kubenswrapper[4799]: E0319 20:09:11.103319 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29\": container with ID starting with fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29 not found: ID does not exist" containerID="fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.103358 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29"} err="failed to get container status \"fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29\": rpc error: code = NotFound desc = could not find container \"fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29\": container with ID starting with fa0ca91503214e481e9c25b4e82e91386d933d4d926f024c0cd2fd28ceffca29 not found: ID does not exist" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.125786 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" path="/var/lib/kubelet/pods/4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202/volumes" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.597519 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.651214 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:09:11 crc 
kubenswrapper[4799]: I0319 20:09:11.927023 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:09:11 crc kubenswrapper[4799]: I0319 20:09:11.978893 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.457419 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr"] Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458218 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="extract-utilities" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458254 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="extract-utilities" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458280 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458297 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458330 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="extract-utilities" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458347 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="extract-utilities" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458374 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="extract-content" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 
20:09:12.458428 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="extract-content" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458452 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="extract-utilities" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458468 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="extract-utilities" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458491 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="extract-content" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458508 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="extract-content" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458535 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="extract-content" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458551 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="extract-content" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458574 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458589 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458617 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 
20:09:12.458632 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: E0319 20:09:12.458655 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274f0b01-a045-405f-80e8-f278a87a97ce" containerName="oauth-openshift" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458670 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="274f0b01-a045-405f-80e8-f278a87a97ce" containerName="oauth-openshift" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458901 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9365ab34-43df-42ee-bc04-baeba5579717" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458933 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="af016f47-a261-463b-98e4-710fdb4557bb" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458964 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a49ea9c-1ab1-4e6d-b8e5-cf879b31d202" containerName="registry-server" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.458990 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="274f0b01-a045-405f-80e8-f278a87a97ce" containerName="oauth-openshift" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.460039 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.466748 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.466940 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.467254 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.467344 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.467377 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.467707 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.467957 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.468045 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.468120 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.468786 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 20:09:12 
crc kubenswrapper[4799]: I0319 20:09:12.470154 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.471996 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.490255 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.492678 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.495667 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr"] Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.501158 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556559 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556668 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: 
\"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556760 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556811 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556837 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556867 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-audit-policies\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556895 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w29h6\" (UniqueName: \"kubernetes.io/projected/6122e822-f33a-4da5-bc98-2c4e94c02ff4-kube-api-access-w29h6\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556926 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556969 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.556999 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.557020 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.557169 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6122e822-f33a-4da5-bc98-2c4e94c02ff4-audit-dir\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.557237 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.557262 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657660 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " 
pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657712 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657752 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657780 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657808 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657830 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657851 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657877 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-audit-policies\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657903 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w29h6\" (UniqueName: \"kubernetes.io/projected/6122e822-f33a-4da5-bc98-2c4e94c02ff4-kube-api-access-w29h6\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.657935 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: 
\"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.658143 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.658172 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.658196 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.658227 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6122e822-f33a-4da5-bc98-2c4e94c02ff4-audit-dir\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.658292 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/6122e822-f33a-4da5-bc98-2c4e94c02ff4-audit-dir\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.659480 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.659590 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-audit-policies\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.659916 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.660288 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc 
kubenswrapper[4799]: I0319 20:09:12.664708 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.664792 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.665157 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.666046 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.666533 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.667436 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.669987 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.672181 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6122e822-f33a-4da5-bc98-2c4e94c02ff4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc kubenswrapper[4799]: I0319 20:09:12.691888 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w29h6\" (UniqueName: \"kubernetes.io/projected/6122e822-f33a-4da5-bc98-2c4e94c02ff4-kube-api-access-w29h6\") pod \"oauth-openshift-7fb5d9b995-5ldzr\" (UID: \"6122e822-f33a-4da5-bc98-2c4e94c02ff4\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:12 crc 
kubenswrapper[4799]: I0319 20:09:12.790781 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:13 crc kubenswrapper[4799]: I0319 20:09:13.233343 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr"] Mar 19 20:09:13 crc kubenswrapper[4799]: I0319 20:09:13.955038 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q5qgk"] Mar 19 20:09:13 crc kubenswrapper[4799]: I0319 20:09:13.955302 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q5qgk" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="registry-server" containerID="cri-o://717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4" gracePeriod=2 Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.041309 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" event={"ID":"6122e822-f33a-4da5-bc98-2c4e94c02ff4","Type":"ContainerStarted","Data":"deaa562e470254ae0bfb98f28f37eac6ca884887dc0f210c1178766c6e8958f4"} Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.041361 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" event={"ID":"6122e822-f33a-4da5-bc98-2c4e94c02ff4","Type":"ContainerStarted","Data":"e5f9f0862d768f73e0610f1cbe6c5dd1d384ef60c0d5fb4295090eed5c83f85e"} Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.041571 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.074844 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" 
podStartSLOduration=32.074823041 podStartE2EDuration="32.074823041s" podCreationTimestamp="2026-03-19 20:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:09:14.073794933 +0000 UTC m=+231.679748015" watchObservedRunningTime="2026-03-19 20:09:14.074823041 +0000 UTC m=+231.680776123" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.101696 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7fb5d9b995-5ldzr" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.537722 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.583406 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-utilities\") pod \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.583475 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdqml\" (UniqueName: \"kubernetes.io/projected/9837fdbc-dcc3-4ba4-a168-a5ec29496871-kube-api-access-wdqml\") pod \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.583511 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-catalog-content\") pod \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\" (UID: \"9837fdbc-dcc3-4ba4-a168-a5ec29496871\") " Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.584694 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-utilities" (OuterVolumeSpecName: "utilities") pod "9837fdbc-dcc3-4ba4-a168-a5ec29496871" (UID: "9837fdbc-dcc3-4ba4-a168-a5ec29496871"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.589605 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9837fdbc-dcc3-4ba4-a168-a5ec29496871-kube-api-access-wdqml" (OuterVolumeSpecName: "kube-api-access-wdqml") pod "9837fdbc-dcc3-4ba4-a168-a5ec29496871" (UID: "9837fdbc-dcc3-4ba4-a168-a5ec29496871"). InnerVolumeSpecName "kube-api-access-wdqml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.685699 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.685785 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdqml\" (UniqueName: \"kubernetes.io/projected/9837fdbc-dcc3-4ba4-a168-a5ec29496871-kube-api-access-wdqml\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.716062 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9837fdbc-dcc3-4ba4-a168-a5ec29496871" (UID: "9837fdbc-dcc3-4ba4-a168-a5ec29496871"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:14 crc kubenswrapper[4799]: I0319 20:09:14.787345 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837fdbc-dcc3-4ba4-a168-a5ec29496871-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.051757 4799 generic.go:334] "Generic (PLEG): container finished" podID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerID="717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4" exitCode=0 Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.051866 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5qgk" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.051891 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerDied","Data":"717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4"} Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.051964 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5qgk" event={"ID":"9837fdbc-dcc3-4ba4-a168-a5ec29496871","Type":"ContainerDied","Data":"bfc90906ba0d9b38c6b63c04dfb37972918c55fd6fff4346fc0c8315b26f0694"} Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.051996 4799 scope.go:117] "RemoveContainer" containerID="717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.078117 4799 scope.go:117] "RemoveContainer" containerID="8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.124142 4799 scope.go:117] "RemoveContainer" containerID="ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 
20:09:15.131136 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q5qgk"] Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.131208 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q5qgk"] Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.147597 4799 scope.go:117] "RemoveContainer" containerID="717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4" Mar 19 20:09:15 crc kubenswrapper[4799]: E0319 20:09:15.152625 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4\": container with ID starting with 717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4 not found: ID does not exist" containerID="717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.152671 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4"} err="failed to get container status \"717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4\": rpc error: code = NotFound desc = could not find container \"717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4\": container with ID starting with 717adf371d6a9dd48772b9730d0755a497e96788412594288ac6d95659bbb5f4 not found: ID does not exist" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.152717 4799 scope.go:117] "RemoveContainer" containerID="8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0" Mar 19 20:09:15 crc kubenswrapper[4799]: E0319 20:09:15.153318 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0\": container with ID starting with 
8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0 not found: ID does not exist" containerID="8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.153432 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0"} err="failed to get container status \"8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0\": rpc error: code = NotFound desc = could not find container \"8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0\": container with ID starting with 8c3182ffeeb2b03fe8edbc86bd02d372a9995c329b1c09d3bffa1a6f1aa75ca0 not found: ID does not exist" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.153489 4799 scope.go:117] "RemoveContainer" containerID="ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083" Mar 19 20:09:15 crc kubenswrapper[4799]: E0319 20:09:15.153869 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083\": container with ID starting with ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083 not found: ID does not exist" containerID="ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083" Mar 19 20:09:15 crc kubenswrapper[4799]: I0319 20:09:15.153915 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083"} err="failed to get container status \"ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083\": rpc error: code = NotFound desc = could not find container \"ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083\": container with ID starting with ff51c1af188818c9fffbb8bbd1aa99d0779177909fefb8d67db994d0e850d083 not found: ID does not 
exist" Mar 19 20:09:17 crc kubenswrapper[4799]: I0319 20:09:17.127099 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" path="/var/lib/kubelet/pods/9837fdbc-dcc3-4ba4-a168-a5ec29496871/volumes" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.185330 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6855947fbf-q8zxk"] Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.185581 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" podUID="b456a663-a0de-426b-9f78-69f04d576342" containerName="controller-manager" containerID="cri-o://7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998" gracePeriod=30 Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.277628 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx"] Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.278072 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" containerName="route-controller-manager" containerID="cri-o://0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494" gracePeriod=30 Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.496774 4799 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.497050 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="extract-utilities" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497063 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="extract-utilities" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.497072 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="registry-server" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497079 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="registry-server" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.497088 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="extract-content" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497097 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="extract-content" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497205 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9837fdbc-dcc3-4ba4-a168-a5ec29496871" containerName="registry-server" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497607 4799 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497715 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497956 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9" gracePeriod=15 Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497981 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177" gracePeriod=15 Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.497998 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901" gracePeriod=15 Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498064 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a" gracePeriod=15 Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498070 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c" gracePeriod=15 Mar 19 20:09:19 crc 
kubenswrapper[4799]: I0319 20:09:19.498490 4799 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498635 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498647 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498657 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498664 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498671 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498678 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498689 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498696 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498703 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="setup" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498710 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498719 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498727 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498735 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498742 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498749 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498757 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.498770 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498779 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498898 4799 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498912 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498921 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498929 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498937 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498947 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.498956 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.499085 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.499097 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: E0319 20:09:19.499111 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.499119 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.499216 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.499231 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.499421 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598188 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598248 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598302 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598341 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598403 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598427 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598461 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.598481 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.671221 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.671894 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.672327 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699039 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699079 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699116 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699135 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699151 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699174 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc 
kubenswrapper[4799]: I0319 20:09:19.699173 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699193 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699180 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699218 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699221 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699245 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699273 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699245 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.699315 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.768300 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.768965 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.769475 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.769835 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.800503 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-proxy-ca-bundles\") pod \"b456a663-a0de-426b-9f78-69f04d576342\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.800547 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6559\" (UniqueName: 
\"kubernetes.io/projected/b456a663-a0de-426b-9f78-69f04d576342-kube-api-access-j6559\") pod \"b456a663-a0de-426b-9f78-69f04d576342\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.800613 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-config\") pod \"b456a663-a0de-426b-9f78-69f04d576342\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.800631 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-client-ca\") pod \"b456a663-a0de-426b-9f78-69f04d576342\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.800709 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b456a663-a0de-426b-9f78-69f04d576342-serving-cert\") pod \"b456a663-a0de-426b-9f78-69f04d576342\" (UID: \"b456a663-a0de-426b-9f78-69f04d576342\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.801622 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-client-ca" (OuterVolumeSpecName: "client-ca") pod "b456a663-a0de-426b-9f78-69f04d576342" (UID: "b456a663-a0de-426b-9f78-69f04d576342"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.801755 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b456a663-a0de-426b-9f78-69f04d576342" (UID: "b456a663-a0de-426b-9f78-69f04d576342"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.801972 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-config" (OuterVolumeSpecName: "config") pod "b456a663-a0de-426b-9f78-69f04d576342" (UID: "b456a663-a0de-426b-9f78-69f04d576342"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.805807 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b456a663-a0de-426b-9f78-69f04d576342-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b456a663-a0de-426b-9f78-69f04d576342" (UID: "b456a663-a0de-426b-9f78-69f04d576342"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.805926 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b456a663-a0de-426b-9f78-69f04d576342-kube-api-access-j6559" (OuterVolumeSpecName: "kube-api-access-j6559") pod "b456a663-a0de-426b-9f78-69f04d576342" (UID: "b456a663-a0de-426b-9f78-69f04d576342"). InnerVolumeSpecName "kube-api-access-j6559". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.901341 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4x8d\" (UniqueName: \"kubernetes.io/projected/7d65d512-1dba-4680-81ba-8e67489154ab-kube-api-access-r4x8d\") pod \"7d65d512-1dba-4680-81ba-8e67489154ab\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.901827 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d65d512-1dba-4680-81ba-8e67489154ab-serving-cert\") pod \"7d65d512-1dba-4680-81ba-8e67489154ab\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.901867 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-config\") pod \"7d65d512-1dba-4680-81ba-8e67489154ab\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.901954 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-client-ca\") pod \"7d65d512-1dba-4680-81ba-8e67489154ab\" (UID: \"7d65d512-1dba-4680-81ba-8e67489154ab\") " Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.902269 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.902299 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:19 crc 
kubenswrapper[4799]: I0319 20:09:19.902315 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b456a663-a0de-426b-9f78-69f04d576342-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.902332 4799 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b456a663-a0de-426b-9f78-69f04d576342-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.902349 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6559\" (UniqueName: \"kubernetes.io/projected/b456a663-a0de-426b-9f78-69f04d576342-kube-api-access-j6559\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.903369 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-config" (OuterVolumeSpecName: "config") pod "7d65d512-1dba-4680-81ba-8e67489154ab" (UID: "7d65d512-1dba-4680-81ba-8e67489154ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.903687 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d65d512-1dba-4680-81ba-8e67489154ab" (UID: "7d65d512-1dba-4680-81ba-8e67489154ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.904598 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d65d512-1dba-4680-81ba-8e67489154ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d65d512-1dba-4680-81ba-8e67489154ab" (UID: "7d65d512-1dba-4680-81ba-8e67489154ab"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:09:19 crc kubenswrapper[4799]: I0319 20:09:19.905669 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d65d512-1dba-4680-81ba-8e67489154ab-kube-api-access-r4x8d" (OuterVolumeSpecName: "kube-api-access-r4x8d") pod "7d65d512-1dba-4680-81ba-8e67489154ab" (UID: "7d65d512-1dba-4680-81ba-8e67489154ab"). InnerVolumeSpecName "kube-api-access-r4x8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.004628 4799 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d65d512-1dba-4680-81ba-8e67489154ab-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.004691 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.004711 4799 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d65d512-1dba-4680-81ba-8e67489154ab-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.004730 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4x8d\" (UniqueName: \"kubernetes.io/projected/7d65d512-1dba-4680-81ba-8e67489154ab-kube-api-access-r4x8d\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.089367 4799 generic.go:334] "Generic (PLEG): container finished" podID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" containerID="7fda262e2ec063127a6ea9b484195d5a3b5c9c2ac225f5902c71aa2df46171f3" exitCode=0 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.089472 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3e94a729-3d36-44ce-8b8f-29e9183ec3ff","Type":"ContainerDied","Data":"7fda262e2ec063127a6ea9b484195d5a3b5c9c2ac225f5902c71aa2df46171f3"} Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.090300 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.090855 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091276 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091609 4799 generic.go:334] "Generic (PLEG): container finished" podID="7d65d512-1dba-4680-81ba-8e67489154ab" containerID="0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494" exitCode=0 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091654 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" 
event={"ID":"7d65d512-1dba-4680-81ba-8e67489154ab","Type":"ContainerDied","Data":"0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494"} Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091703 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091735 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" event={"ID":"7d65d512-1dba-4680-81ba-8e67489154ab","Type":"ContainerDied","Data":"292192e08e37bc4c4137fdd691580bc715a049cb739696df18b53505ebd50a92"} Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091776 4799 scope.go:117] "RemoveContainer" containerID="0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.091978 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.092667 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.093140 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.093633 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.094062 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.094330 4799 generic.go:334] "Generic (PLEG): container finished" podID="b456a663-a0de-426b-9f78-69f04d576342" containerID="7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998" exitCode=0 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.094414 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" event={"ID":"b456a663-a0de-426b-9f78-69f04d576342","Type":"ContainerDied","Data":"7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998"} Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.094446 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" event={"ID":"b456a663-a0de-426b-9f78-69f04d576342","Type":"ContainerDied","Data":"f91e744c3cf054116f5a93958bbe4844afd618b549373d2f3fdc8f11a2dce362"} Mar 19 20:09:20 crc kubenswrapper[4799]: 
I0319 20:09:20.094458 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.095767 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.096377 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.096820 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.097553 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.100330 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.102363 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.104030 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177" exitCode=0 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.104079 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c" exitCode=0 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.104096 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901" exitCode=0 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.104114 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a" exitCode=2 Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.120828 4799 scope.go:117] "RemoveContainer" containerID="0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494" Mar 19 20:09:20 crc kubenswrapper[4799]: E0319 20:09:20.123496 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494\": container with ID starting with 0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494 not found: ID does not exist" 
containerID="0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.123766 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494"} err="failed to get container status \"0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494\": rpc error: code = NotFound desc = could not find container \"0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494\": container with ID starting with 0ca4b319ade8037efc0aaa0cf1e22dd8f974dec4b9d68fa30972e54ae1aff494 not found: ID does not exist" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.123810 4799 scope.go:117] "RemoveContainer" containerID="7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.124933 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.125433 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.125785 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.126453 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.126848 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.127376 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.127984 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.128654 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" 
pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.143033 4799 scope.go:117] "RemoveContainer" containerID="7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998" Mar 19 20:09:20 crc kubenswrapper[4799]: E0319 20:09:20.143603 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998\": container with ID starting with 7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998 not found: ID does not exist" containerID="7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.143663 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998"} err="failed to get container status \"7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998\": rpc error: code = NotFound desc = could not find container \"7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998\": container with ID starting with 7b1dd416656921e1ab0cfe50ef7ddaa44abe213dc65f2246d10ad8413f34e998 not found: ID does not exist" Mar 19 20:09:20 crc kubenswrapper[4799]: I0319 20:09:20.143689 4799 scope.go:117] "RemoveContainer" containerID="465b6cb122331a4c9f7f1451e9df4212d20288a8409b2c334b255f82aebf0657" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.120966 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.467819 4799 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.468760 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.469188 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.469618 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.629787 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kubelet-dir\") pod \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.629833 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kube-api-access\") pod 
\"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.629903 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-var-lock\") pod \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\" (UID: \"3e94a729-3d36-44ce-8b8f-29e9183ec3ff\") " Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.630281 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-var-lock" (OuterVolumeSpecName: "var-lock") pod "3e94a729-3d36-44ce-8b8f-29e9183ec3ff" (UID: "3e94a729-3d36-44ce-8b8f-29e9183ec3ff"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.630317 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3e94a729-3d36-44ce-8b8f-29e9183ec3ff" (UID: "3e94a729-3d36-44ce-8b8f-29e9183ec3ff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.653801 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3e94a729-3d36-44ce-8b8f-29e9183ec3ff" (UID: "3e94a729-3d36-44ce-8b8f-29e9183ec3ff"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.731522 4799 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.731833 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.731846 4799 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3e94a729-3d36-44ce-8b8f-29e9183ec3ff-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.868308 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.869414 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.870211 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.870723 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.871224 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:21 crc kubenswrapper[4799]: I0319 20:09:21.871636 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.035778 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.035863 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.035901 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.035959 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.036031 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.036109 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.036428 4799 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.036483 4799 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.036500 4799 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.145598 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.147027 4799 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9" exitCode=0 Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.147136 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.147150 4799 scope.go:117] "RemoveContainer" containerID="68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.150866 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3e94a729-3d36-44ce-8b8f-29e9183ec3ff","Type":"ContainerDied","Data":"5f7f639cf101f7232883e57690e8ef8809e24024330541fde1bd49d0833281ca"} Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.150921 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f7f639cf101f7232883e57690e8ef8809e24024330541fde1bd49d0833281ca" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.150978 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.174602 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.175155 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.175659 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.176115 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.176174 4799 scope.go:117] "RemoveContainer" containerID="d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.183503 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.183989 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.184641 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.185125 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.195717 4799 scope.go:117] "RemoveContainer" containerID="4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.217291 4799 scope.go:117] "RemoveContainer" containerID="0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.248885 4799 scope.go:117] "RemoveContainer" containerID="38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.272456 4799 scope.go:117] "RemoveContainer" containerID="a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.299418 4799 scope.go:117] "RemoveContainer" containerID="68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177" Mar 19 20:09:22 crc kubenswrapper[4799]: E0319 20:09:22.301344 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177\": container with ID starting with 68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177 not found: ID does not exist" containerID="68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 
20:09:22.301497 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177"} err="failed to get container status \"68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177\": rpc error: code = NotFound desc = could not find container \"68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177\": container with ID starting with 68d02fb5bc24a64b0b01b91dda515df540481446ea4daaa7ced6e746d7c9e177 not found: ID does not exist" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.301543 4799 scope.go:117] "RemoveContainer" containerID="d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c" Mar 19 20:09:22 crc kubenswrapper[4799]: E0319 20:09:22.302298 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c\": container with ID starting with d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c not found: ID does not exist" containerID="d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.302436 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c"} err="failed to get container status \"d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c\": rpc error: code = NotFound desc = could not find container \"d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c\": container with ID starting with d108749e74cfd7d56dfaa5a2983a895dbc7399584969f450ed85b9c185bee49c not found: ID does not exist" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.302581 4799 scope.go:117] "RemoveContainer" containerID="4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901" Mar 19 20:09:22 crc 
kubenswrapper[4799]: E0319 20:09:22.303226 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901\": container with ID starting with 4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901 not found: ID does not exist" containerID="4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.303277 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901"} err="failed to get container status \"4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901\": rpc error: code = NotFound desc = could not find container \"4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901\": container with ID starting with 4e4e2865e865f2bdf5237f2af67060957f6bbde783f240f007fc899b87b29901 not found: ID does not exist" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.303309 4799 scope.go:117] "RemoveContainer" containerID="0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a" Mar 19 20:09:22 crc kubenswrapper[4799]: E0319 20:09:22.304352 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a\": container with ID starting with 0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a not found: ID does not exist" containerID="0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.304431 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a"} err="failed to get container status 
\"0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a\": rpc error: code = NotFound desc = could not find container \"0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a\": container with ID starting with 0e90e897a7282cc035bdbab5931108483d0cd0debf157d48d15e09554de7fe2a not found: ID does not exist" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.304466 4799 scope.go:117] "RemoveContainer" containerID="38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9" Mar 19 20:09:22 crc kubenswrapper[4799]: E0319 20:09:22.305088 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9\": container with ID starting with 38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9 not found: ID does not exist" containerID="38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.305150 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9"} err="failed to get container status \"38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9\": rpc error: code = NotFound desc = could not find container \"38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9\": container with ID starting with 38883060655d0d43ce5b0c6cfc734dddac3b81de8ad2a9de4ae9b8d3660afac9 not found: ID does not exist" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.305200 4799 scope.go:117] "RemoveContainer" containerID="a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac" Mar 19 20:09:22 crc kubenswrapper[4799]: E0319 20:09:22.305652 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\": container with ID starting with a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac not found: ID does not exist" containerID="a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac" Mar 19 20:09:22 crc kubenswrapper[4799]: I0319 20:09:22.305703 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac"} err="failed to get container status \"a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\": rpc error: code = NotFound desc = could not find container \"a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac\": container with ID starting with a0a8ef36a41f31f4531b73a525ae04375a2e34f36c26f8f14c8e97fa7d05e0ac not found: ID does not exist" Mar 19 20:09:23 crc kubenswrapper[4799]: I0319 20:09:23.118941 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: I0319 20:09:23.119705 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: I0319 20:09:23.120103 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: I0319 20:09:23.121344 4799 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: I0319 20:09:23.130997 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.577617 4799 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.578103 4799 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.578473 4799 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.578815 4799 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection 
refused" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.579194 4799 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:23 crc kubenswrapper[4799]: I0319 20:09:23.579241 4799 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.579599 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="200ms" Mar 19 20:09:23 crc kubenswrapper[4799]: E0319 20:09:23.781349 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="400ms" Mar 19 20:09:24 crc kubenswrapper[4799]: E0319 20:09:24.183571 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="800ms" Mar 19 20:09:24 crc kubenswrapper[4799]: E0319 20:09:24.528552 4799 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:24 crc kubenswrapper[4799]: I0319 20:09:24.529727 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:24 crc kubenswrapper[4799]: E0319 20:09:24.563262 4799 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e5700e3b5e8eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:09:24.562520299 +0000 UTC m=+242.168473371,LastTimestamp:2026-03-19 20:09:24.562520299 +0000 UTC m=+242.168473371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:09:24 crc kubenswrapper[4799]: E0319 20:09:24.984367 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="1.6s" Mar 19 20:09:25 crc kubenswrapper[4799]: I0319 20:09:25.175654 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b"} Mar 19 20:09:25 crc 
kubenswrapper[4799]: I0319 20:09:25.175706 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c69df28ab941bc5b62397dfdecd1dac645741253bd6933500e8d1db8f934e029"} Mar 19 20:09:25 crc kubenswrapper[4799]: I0319 20:09:25.176373 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:25 crc kubenswrapper[4799]: E0319 20:09:25.176451 4799 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:25 crc kubenswrapper[4799]: I0319 20:09:25.176853 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:25 crc kubenswrapper[4799]: I0319 20:09:25.177187 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:26 crc kubenswrapper[4799]: E0319 20:09:26.586431 4799 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="3.2s" Mar 19 20:09:27 crc kubenswrapper[4799]: E0319 20:09:27.146014 4799 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" volumeName="registry-storage" Mar 19 20:09:28 crc kubenswrapper[4799]: I0319 20:09:28.755995 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:09:28 crc kubenswrapper[4799]: I0319 20:09:28.756083 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:09:29 crc kubenswrapper[4799]: E0319 20:09:29.788489 4799 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="6.4s" Mar 19 20:09:32 crc kubenswrapper[4799]: E0319 20:09:32.216919 4799 event.go:368] "Unable to write event (may retry after 
sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e5700e3b5e8eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 20:09:24.562520299 +0000 UTC m=+242.168473371,LastTimestamp:2026-03-19 20:09:24.562520299 +0000 UTC m=+242.168473371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.231609 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.233076 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.233142 4799 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f2c37e72f55493fc298426243c3f3dbb349d51b07a336295c38878065e68c831" exitCode=1 Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.233179 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f2c37e72f55493fc298426243c3f3dbb349d51b07a336295c38878065e68c831"} Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.233671 4799 scope.go:117] "RemoveContainer" containerID="f2c37e72f55493fc298426243c3f3dbb349d51b07a336295c38878065e68c831" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.234574 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.237847 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.238285 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.238815 4799 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.611155 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:09:32 crc kubenswrapper[4799]: I0319 20:09:32.690190 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.122038 4799 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.122693 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.123264 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.123735 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" 
pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.249086 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.250129 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.250228 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f291e94710e7ed43423a604283923dedd092e4d07d20bb93d64f8d8ab09b51a"} Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.251345 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.252049 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 
20:09:33.252650 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:33 crc kubenswrapper[4799]: I0319 20:09:33.253172 4799 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.115598 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.116569 4799 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.117097 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.117566 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" 
pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.118017 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.139508 4799 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.139554 4799 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:34 crc kubenswrapper[4799]: E0319 20:09:34.140185 4799 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.141074 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:34 crc kubenswrapper[4799]: W0319 20:09:34.171811 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-94dfd55c8bdfb29cf7e2ccb5473be3ee874baf6d3f569013a266082ad2044d91 WatchSource:0}: Error finding container 94dfd55c8bdfb29cf7e2ccb5473be3ee874baf6d3f569013a266082ad2044d91: Status 404 returned error can't find the container with id 94dfd55c8bdfb29cf7e2ccb5473be3ee874baf6d3f569013a266082ad2044d91 Mar 19 20:09:34 crc kubenswrapper[4799]: I0319 20:09:34.261125 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"94dfd55c8bdfb29cf7e2ccb5473be3ee874baf6d3f569013a266082ad2044d91"} Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.270980 4799 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c4401371640ef3b098fcd3784782c9fbbe9212eac5757f8bc4dbbe9758dcf561" exitCode=0 Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.271044 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c4401371640ef3b098fcd3784782c9fbbe9212eac5757f8bc4dbbe9758dcf561"} Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.271464 4799 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.271502 4799 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:35 crc kubenswrapper[4799]: E0319 20:09:35.272212 4799 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.272462 4799 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.273107 4799 status_manager.go:851] "Failed to get status for pod" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.273640 4799 status_manager.go:851] "Failed to get status for pod" podUID="b456a663-a0de-426b-9f78-69f04d576342" pod="openshift-controller-manager/controller-manager-6855947fbf-q8zxk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6855947fbf-q8zxk\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:35 crc kubenswrapper[4799]: I0319 20:09:35.274084 4799 status_manager.go:851] "Failed to get status for pod" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" pod="openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-5c56949c99-l4vtx\": dial tcp 38.102.83.107:6443: connect: connection refused" Mar 19 20:09:36 crc kubenswrapper[4799]: I0319 
20:09:36.282785 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"00f4571eb276ab5b188dd5e2f0b598eddc299923fba97e733439700c37791b9f"} Mar 19 20:09:36 crc kubenswrapper[4799]: I0319 20:09:36.283212 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"222745920ae8571e57b1154e898c5657d8f8fcaef668f745d31af45457ec6ef9"} Mar 19 20:09:36 crc kubenswrapper[4799]: I0319 20:09:36.283227 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ed06fba5d00027931e3a41c2aa4074270da5dbac0b56d7cddc4cfca769f346bc"} Mar 19 20:09:37 crc kubenswrapper[4799]: I0319 20:09:37.289296 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4390485a39d0b0448d83c08689b0271a7ed7aaa6f5be8d733f4953df9915723e"} Mar 19 20:09:37 crc kubenswrapper[4799]: I0319 20:09:37.289580 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:37 crc kubenswrapper[4799]: I0319 20:09:37.289589 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"124c324cfe1c14f4b50f9efdd3f846223c820277cca83518f976787951c849f6"} Mar 19 20:09:37 crc kubenswrapper[4799]: I0319 20:09:37.289526 4799 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:37 crc kubenswrapper[4799]: I0319 20:09:37.289606 4799 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:39 crc kubenswrapper[4799]: I0319 20:09:39.141918 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:39 crc kubenswrapper[4799]: I0319 20:09:39.141982 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:39 crc kubenswrapper[4799]: I0319 20:09:39.150258 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.232275 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.238559 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.296560 4799 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.315329 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.315362 4799 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.315402 4799 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.318478 4799 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:09:42 crc kubenswrapper[4799]: I0319 20:09:42.322809 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:43 crc kubenswrapper[4799]: I0319 20:09:43.131091 4799 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="62cd0fdf-3c62-4eaf-ae29-43ffdb18ad81" Mar 19 20:09:43 crc kubenswrapper[4799]: I0319 20:09:43.317954 4799 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:43 crc kubenswrapper[4799]: I0319 20:09:43.317987 4799 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b2940076-1ae1-4544-8060-faba015730bb" Mar 19 20:09:43 crc kubenswrapper[4799]: I0319 20:09:43.322437 4799 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="62cd0fdf-3c62-4eaf-ae29-43ffdb18ad81" Mar 19 20:09:51 crc kubenswrapper[4799]: I0319 20:09:51.755264 4799 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 20:09:51 crc kubenswrapper[4799]: I0319 20:09:51.762942 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c56949c99-l4vtx","openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-6855947fbf-q8zxk"] Mar 19 20:09:51 crc kubenswrapper[4799]: I0319 20:09:51.763057 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 20:09:51 crc kubenswrapper[4799]: I0319 20:09:51.770318 
4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 20:09:51 crc kubenswrapper[4799]: I0319 20:09:51.796722 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.796690799 podStartE2EDuration="9.796690799s" podCreationTimestamp="2026-03-19 20:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:09:51.791761484 +0000 UTC m=+269.397714646" watchObservedRunningTime="2026-03-19 20:09:51.796690799 +0000 UTC m=+269.402643911" Mar 19 20:09:52 crc kubenswrapper[4799]: I0319 20:09:52.106657 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 20:09:52 crc kubenswrapper[4799]: I0319 20:09:52.538661 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 20:09:52 crc kubenswrapper[4799]: I0319 20:09:52.659973 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.131062 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" path="/var/lib/kubelet/pods/7d65d512-1dba-4680-81ba-8e67489154ab/volumes" Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.133001 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b456a663-a0de-426b-9f78-69f04d576342" path="/var/lib/kubelet/pods/b456a663-a0de-426b-9f78-69f04d576342/volumes" Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.317281 4799 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.354632 4799 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.358750 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.558447 4799 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.559757 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b" gracePeriod=5 Mar 19 20:09:53 crc kubenswrapper[4799]: I0319 20:09:53.627348 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.061450 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.129327 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.170129 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.180617 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.186148 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 
20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.260565 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.361223 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.594313 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.658913 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.787898 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 20:09:54 crc kubenswrapper[4799]: I0319 20:09:54.975267 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.065111 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.150880 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.185420 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.312260 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.444173 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.609869 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.648292 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.656071 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.735002 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.827213 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.875723 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 20:09:55 crc kubenswrapper[4799]: I0319 20:09:55.923769 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.067744 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.071098 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.099333 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.099936 4799 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.174361 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.180193 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.275192 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.391005 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.462838 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.473304 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.477129 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.576790 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.629499 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.636676 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 
20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.644896 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.672890 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.782220 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.957147 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 20:09:56 crc kubenswrapper[4799]: I0319 20:09:56.968957 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.001777 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.070446 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.084592 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.130994 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.153041 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.328670 4799 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.368683 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.379019 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.470109 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.524720 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.582193 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.604690 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.665645 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.687333 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.901306 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 20:09:57 crc kubenswrapper[4799]: I0319 20:09:57.903735 4799 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 20:09:57 crc 
kubenswrapper[4799]: I0319 20:09:57.925592 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.049019 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.120128 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.130343 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.171806 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.278657 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.374286 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.390759 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.406561 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.461063 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.529286 4799 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.547208 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.608150 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.631435 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.706870 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.733414 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.756166 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.756240 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.780107 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.783115 4799 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 20:09:58 crc kubenswrapper[4799]: I0319 20:09:58.860111 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.055042 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.157727 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.176332 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.193870 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.193952 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.226830 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.226900 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.226948 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.226995 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.226999 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.227038 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.227098 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.227148 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.227523 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.228219 4799 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.228298 4799 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.228321 4799 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.228370 4799 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.238065 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.284200 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.329980 4799 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.335885 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.385079 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.385182 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.441615 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.445988 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.446076 4799 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b" exitCode=137 Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.446183 4799 scope.go:117] "RemoveContainer" containerID="5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.446197 4799 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.447353 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.473415 4799 scope.go:117] "RemoveContainer" containerID="5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b" Mar 19 20:09:59 crc kubenswrapper[4799]: E0319 20:09:59.474199 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b\": container with ID starting with 5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b not found: ID does not exist" containerID="5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.474295 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b"} err="failed to get container status \"5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b\": rpc error: code = NotFound desc = could not find container \"5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b\": container with ID starting with 5283c339709f826bc129ea90533145d0ca856e54d3e3ec6b23c4aa0cc9dd870b not found: ID does not exist" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.493984 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.505757 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.526116 4799 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.528056 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.552964 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.580061 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.907453 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 20:09:59 crc kubenswrapper[4799]: I0319 20:09:59.958649 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.010524 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.054972 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.081072 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.084934 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.085112 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 
20:10:00.102606 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.188278 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.422278 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.424051 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.428750 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.474371 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.494056 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.510894 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.621860 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.623064 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.676796 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.798673 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.800590 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.810599 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.848821 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.859644 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.880492 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.948764 4799 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 20:10:00 crc kubenswrapper[4799]: I0319 20:10:00.976682 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.023554 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.127672 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 20:10:01 crc kubenswrapper[4799]: 
I0319 20:10:01.271592 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.316736 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.350879 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.469610 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.557473 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.583243 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.602431 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.652177 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.739003 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.751686 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.883762 4799 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.924312 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.947998 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.949882 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.968966 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 20:10:01 crc kubenswrapper[4799]: I0319 20:10:01.985786 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.070190 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.113688 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.138314 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.256919 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.338376 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 
20:10:02.416545 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.432291 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.475515 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.563575 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.624538 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.644456 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.644505 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.681705 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.684694 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.714307 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718244 4799 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565850-8c5pj"] Mar 19 20:10:02 crc kubenswrapper[4799]: E0319 20:10:02.718524 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718545 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 20:10:02 crc kubenswrapper[4799]: E0319 20:10:02.718562 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" containerName="route-controller-manager" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718569 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" containerName="route-controller-manager" Mar 19 20:10:02 crc kubenswrapper[4799]: E0319 20:10:02.718585 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b456a663-a0de-426b-9f78-69f04d576342" containerName="controller-manager" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718592 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b456a663-a0de-426b-9f78-69f04d576342" containerName="controller-manager" Mar 19 20:10:02 crc kubenswrapper[4799]: E0319 20:10:02.718601 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" containerName="installer" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718609 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" containerName="installer" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718689 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718702 4799 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7d65d512-1dba-4680-81ba-8e67489154ab" containerName="route-controller-manager" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718708 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e94a729-3d36-44ce-8b8f-29e9183ec3ff" containerName="installer" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.718717 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b456a663-a0de-426b-9f78-69f04d576342" containerName="controller-manager" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.719054 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-8c5pj" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.723849 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.724142 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.724315 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.725296 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64d69869cf-cwtwp"] Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.726311 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.731695 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"] Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.732986 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.734417 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.734655 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.734744 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.734803 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.738038 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.737680 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.738240 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.742033 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.743136 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.743611 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 20:10:02 crc 
kubenswrapper[4799]: I0319 20:10:02.744366 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.744967 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.746623 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.769676 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775453 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwz5\" (UniqueName: \"kubernetes.io/projected/c5a1c8e9-87ee-486d-ab69-7a27baee524c-kube-api-access-glwz5\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775502 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkv6c\" (UniqueName: \"kubernetes.io/projected/b6f7ab30-1912-4d48-8d01-050adc5cf64f-kube-api-access-gkv6c\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775522 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a1c8e9-87ee-486d-ab69-7a27baee524c-client-ca\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" 
(UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775541 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkk8q\" (UniqueName: \"kubernetes.io/projected/cd47b02a-a448-4b51-bec1-977f2ebbc4e2-kube-api-access-fkk8q\") pod \"auto-csr-approver-29565850-8c5pj\" (UID: \"cd47b02a-a448-4b51-bec1-977f2ebbc4e2\") " pod="openshift-infra/auto-csr-approver-29565850-8c5pj" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775559 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f7ab30-1912-4d48-8d01-050adc5cf64f-serving-cert\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775574 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a1c8e9-87ee-486d-ab69-7a27baee524c-serving-cert\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775614 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-config\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775655 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-client-ca\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775669 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a1c8e9-87ee-486d-ab69-7a27baee524c-config\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.775694 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-proxy-ca-bundles\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.834710 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877134 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-client-ca\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877206 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5a1c8e9-87ee-486d-ab69-7a27baee524c-config\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877266 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-proxy-ca-bundles\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877319 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwz5\" (UniqueName: \"kubernetes.io/projected/c5a1c8e9-87ee-486d-ab69-7a27baee524c-kube-api-access-glwz5\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877377 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkv6c\" (UniqueName: \"kubernetes.io/projected/b6f7ab30-1912-4d48-8d01-050adc5cf64f-kube-api-access-gkv6c\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877461 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a1c8e9-87ee-486d-ab69-7a27baee524c-client-ca\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " 
pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877492 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkk8q\" (UniqueName: \"kubernetes.io/projected/cd47b02a-a448-4b51-bec1-977f2ebbc4e2-kube-api-access-fkk8q\") pod \"auto-csr-approver-29565850-8c5pj\" (UID: \"cd47b02a-a448-4b51-bec1-977f2ebbc4e2\") " pod="openshift-infra/auto-csr-approver-29565850-8c5pj" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877527 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f7ab30-1912-4d48-8d01-050adc5cf64f-serving-cert\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877562 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a1c8e9-87ee-486d-ab69-7a27baee524c-serving-cert\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.877632 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-config\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.878157 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-client-ca\") pod 
\"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.878491 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-proxy-ca-bundles\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.878608 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a1c8e9-87ee-486d-ab69-7a27baee524c-client-ca\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.878868 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a1c8e9-87ee-486d-ab69-7a27baee524c-config\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.879928 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f7ab30-1912-4d48-8d01-050adc5cf64f-config\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.883145 4799 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.883701 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f7ab30-1912-4d48-8d01-050adc5cf64f-serving-cert\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.885229 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a1c8e9-87ee-486d-ab69-7a27baee524c-serving-cert\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.893884 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkk8q\" (UniqueName: \"kubernetes.io/projected/cd47b02a-a448-4b51-bec1-977f2ebbc4e2-kube-api-access-fkk8q\") pod \"auto-csr-approver-29565850-8c5pj\" (UID: \"cd47b02a-a448-4b51-bec1-977f2ebbc4e2\") " pod="openshift-infra/auto-csr-approver-29565850-8c5pj"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.901108 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkv6c\" (UniqueName: \"kubernetes.io/projected/b6f7ab30-1912-4d48-8d01-050adc5cf64f-kube-api-access-gkv6c\") pod \"controller-manager-64d69869cf-cwtwp\" (UID: \"b6f7ab30-1912-4d48-8d01-050adc5cf64f\") " pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.914753 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwz5\" (UniqueName: \"kubernetes.io/projected/c5a1c8e9-87ee-486d-ab69-7a27baee524c-kube-api-access-glwz5\") pod \"route-controller-manager-6cdd568d8f-mcpnk\" (UID: \"c5a1c8e9-87ee-486d-ab69-7a27baee524c\") " pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.953422 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 20:10:02 crc kubenswrapper[4799]: I0319 20:10:02.997831 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.030112 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.045376 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-8c5pj"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.057034 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.064414 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.080012 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.316176 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.395857 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.486333 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.496991 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.516225 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.554763 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.577658 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.608336 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.660462 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.687812 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.751755 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.793542 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 20:10:03 crc kubenswrapper[4799]: I0319 20:10:03.871775 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.030451 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-8c5pj"]
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.035344 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d69869cf-cwtwp"]
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.046988 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.047850 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"]
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.087127 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.099205 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.275640 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.385825 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.408999 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.467368 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.649819 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.652154 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.680044 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.718037 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.738971 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.800404 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 20:10:04 crc kubenswrapper[4799]: I0319 20:10:04.866307 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.005723 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.044911 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.068496 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.092423 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.109222 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.150134 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.242271 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.388463 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.557135 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.570031 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.582277 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.592709 4799 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.664825 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.729522 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 19 20:10:05 crc kubenswrapper[4799]: I0319 20:10:05.954255 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d69869cf-cwtwp"]
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.053706 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.084242 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-8c5pj"]
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.094978 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.099288 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"]
Mar 19 20:10:06 crc kubenswrapper[4799]: W0319 20:10:06.117297 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5a1c8e9_87ee_486d_ab69_7a27baee524c.slice/crio-e1f4c9e15d6bc64aebe947c7dded7a354e2a7e2977aea7b082176f3b531af3c4 WatchSource:0}: Error finding container e1f4c9e15d6bc64aebe947c7dded7a354e2a7e2977aea7b082176f3b531af3c4: Status 404 returned error can't find the container with id e1f4c9e15d6bc64aebe947c7dded7a354e2a7e2977aea7b082176f3b531af3c4
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.224104 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.490422 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.493369 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" event={"ID":"b6f7ab30-1912-4d48-8d01-050adc5cf64f","Type":"ContainerStarted","Data":"9c81d4a2491d65f4af006276eb2eee7e9b6889b6a6bf3996bec1291a66f6d8e4"}
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.493588 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.493659 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" event={"ID":"b6f7ab30-1912-4d48-8d01-050adc5cf64f","Type":"ContainerStarted","Data":"3027855d0fe56e49cca35ccf9da403d668c8a22cccba79df2845f868ddec3cb0"}
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.494465 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-8c5pj" event={"ID":"cd47b02a-a448-4b51-bec1-977f2ebbc4e2","Type":"ContainerStarted","Data":"719db45a5b16c990ca52f5e56d693c88e12314b84aa47fa98ea528079e0ef924"}
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.495908 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" event={"ID":"c5a1c8e9-87ee-486d-ab69-7a27baee524c","Type":"ContainerStarted","Data":"9e548d1940c96ea6f819f98ccb1bfc479c7d54232322c3aa9b5811577021ac79"}
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.495943 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" event={"ID":"c5a1c8e9-87ee-486d-ab69-7a27baee524c","Type":"ContainerStarted","Data":"e1f4c9e15d6bc64aebe947c7dded7a354e2a7e2977aea7b082176f3b531af3c4"}
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.496148 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.497469 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.513629 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64d69869cf-cwtwp" podStartSLOduration=47.513613998 podStartE2EDuration="47.513613998s" podCreationTimestamp="2026-03-19 20:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:10:06.511593863 +0000 UTC m=+284.117546935" watchObservedRunningTime="2026-03-19 20:10:06.513613998 +0000 UTC m=+284.119567070"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.514183 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.546510 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk" podStartSLOduration=47.546483856 podStartE2EDuration="47.546483856s" podCreationTimestamp="2026-03-19 20:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:10:06.540805271 +0000 UTC m=+284.146758343" watchObservedRunningTime="2026-03-19 20:10:06.546483856 +0000 UTC m=+284.152436928"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.718882 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cdd568d8f-mcpnk"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.728212 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 19 20:10:06 crc kubenswrapper[4799]: I0319 20:10:06.805800 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.075358 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.078153 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.100278 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.255933 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.276329 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.368592 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.640547 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.676894 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.742080 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 20:10:07 crc kubenswrapper[4799]: I0319 20:10:07.877990 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.060167 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.100532 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.217723 4799 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.322826 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.346977 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.510455 4799 generic.go:334] "Generic (PLEG): container finished" podID="cd47b02a-a448-4b51-bec1-977f2ebbc4e2" containerID="22b59f0344f3d7f5047b20c8a7d5f8d6917a3385f514e25a0070a94f8f932194" exitCode=0
Mar 19 20:10:08 crc kubenswrapper[4799]: I0319 20:10:08.510536 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-8c5pj" event={"ID":"cd47b02a-a448-4b51-bec1-977f2ebbc4e2","Type":"ContainerDied","Data":"22b59f0344f3d7f5047b20c8a7d5f8d6917a3385f514e25a0070a94f8f932194"}
Mar 19 20:10:09 crc kubenswrapper[4799]: I0319 20:10:09.877884 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-8c5pj"
Mar 19 20:10:09 crc kubenswrapper[4799]: I0319 20:10:09.985312 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkk8q\" (UniqueName: \"kubernetes.io/projected/cd47b02a-a448-4b51-bec1-977f2ebbc4e2-kube-api-access-fkk8q\") pod \"cd47b02a-a448-4b51-bec1-977f2ebbc4e2\" (UID: \"cd47b02a-a448-4b51-bec1-977f2ebbc4e2\") "
Mar 19 20:10:09 crc kubenswrapper[4799]: I0319 20:10:09.994038 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd47b02a-a448-4b51-bec1-977f2ebbc4e2-kube-api-access-fkk8q" (OuterVolumeSpecName: "kube-api-access-fkk8q") pod "cd47b02a-a448-4b51-bec1-977f2ebbc4e2" (UID: "cd47b02a-a448-4b51-bec1-977f2ebbc4e2"). InnerVolumeSpecName "kube-api-access-fkk8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:10:10 crc kubenswrapper[4799]: I0319 20:10:10.088906 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkk8q\" (UniqueName: \"kubernetes.io/projected/cd47b02a-a448-4b51-bec1-977f2ebbc4e2-kube-api-access-fkk8q\") on node \"crc\" DevicePath \"\""
Mar 19 20:10:10 crc kubenswrapper[4799]: I0319 20:10:10.527306 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-8c5pj" event={"ID":"cd47b02a-a448-4b51-bec1-977f2ebbc4e2","Type":"ContainerDied","Data":"719db45a5b16c990ca52f5e56d693c88e12314b84aa47fa98ea528079e0ef924"}
Mar 19 20:10:10 crc kubenswrapper[4799]: I0319 20:10:10.527418 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719db45a5b16c990ca52f5e56d693c88e12314b84aa47fa98ea528079e0ef924"
Mar 19 20:10:10 crc kubenswrapper[4799]: I0319 20:10:10.527446 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-8c5pj"
Mar 19 20:10:17 crc kubenswrapper[4799]: I0319 20:10:17.472927 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 19 20:10:22 crc kubenswrapper[4799]: I0319 20:10:22.754100 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 20:10:27 crc kubenswrapper[4799]: I0319 20:10:27.506457 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 20:10:27 crc kubenswrapper[4799]: I0319 20:10:27.614953 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 20:10:28 crc kubenswrapper[4799]: I0319 20:10:28.756628 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 20:10:28 crc kubenswrapper[4799]: I0319 20:10:28.756762 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 20:10:28 crc kubenswrapper[4799]: I0319 20:10:28.756924 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p"
Mar 19 20:10:28 crc kubenswrapper[4799]: I0319 20:10:28.757697 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 20:10:28 crc kubenswrapper[4799]: I0319 20:10:28.757753 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427" gracePeriod=600
Mar 19 20:10:29 crc kubenswrapper[4799]: I0319 20:10:29.670127 4799 generic.go:334] "Generic (PLEG): container finished" podID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerID="cd3deca1b9f8858ef86103f41d40b0d37ff03a0f5444ba684060abe85cac4b48" exitCode=0
Mar 19 20:10:29 crc kubenswrapper[4799]: I0319 20:10:29.670251 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" event={"ID":"e7a739cc-cb77-45dc-9811-661046ccf05b","Type":"ContainerDied","Data":"cd3deca1b9f8858ef86103f41d40b0d37ff03a0f5444ba684060abe85cac4b48"}
Mar 19 20:10:29 crc kubenswrapper[4799]: I0319 20:10:29.672038 4799 scope.go:117] "RemoveContainer" containerID="cd3deca1b9f8858ef86103f41d40b0d37ff03a0f5444ba684060abe85cac4b48"
Mar 19 20:10:29 crc kubenswrapper[4799]: I0319 20:10:29.676310 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427" exitCode=0
Mar 19 20:10:29 crc kubenswrapper[4799]: I0319 20:10:29.676362 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427"}
Mar 19 20:10:29 crc kubenswrapper[4799]: I0319 20:10:29.676417 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2"}
Mar 19 20:10:30 crc kubenswrapper[4799]: I0319 20:10:30.682612 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" event={"ID":"e7a739cc-cb77-45dc-9811-661046ccf05b","Type":"ContainerStarted","Data":"389a918cd6480682b65d0827bdf2697dbc887e13162cc5866f65d190a4466074"}
Mar 19 20:10:30 crc kubenswrapper[4799]: I0319 20:10:30.683425 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f"
Mar 19 20:10:30 crc kubenswrapper[4799]: I0319 20:10:30.688542 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f"
Mar 19 20:10:33 crc kubenswrapper[4799]: I0319 20:10:33.298071 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 20:10:36 crc kubenswrapper[4799]: I0319 20:10:36.025153 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 20:10:36 crc kubenswrapper[4799]: I0319 20:10:36.374940 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 20:10:36 crc kubenswrapper[4799]: I0319 20:10:36.927862 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 20:10:38 crc kubenswrapper[4799]: I0319 20:10:38.757100 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 19 20:10:41 crc kubenswrapper[4799]: I0319 20:10:41.759103 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 20:10:49 crc kubenswrapper[4799]: I0319 20:10:49.119772 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.613860 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-grlqs"]
Mar 19 20:11:21 crc kubenswrapper[4799]: E0319 20:11:21.614679 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47b02a-a448-4b51-bec1-977f2ebbc4e2" containerName="oc"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.614695 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47b02a-a448-4b51-bec1-977f2ebbc4e2" containerName="oc"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.614829 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd47b02a-a448-4b51-bec1-977f2ebbc4e2" containerName="oc"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.615224 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.627782 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-grlqs"]
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.705667 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrnv\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-kube-api-access-gqrnv\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.705704 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-bound-sa-token\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.705735 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-registry-tls\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.705752 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c8696404-9e41-478d-86ec-0382990a925f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.705956 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c8696404-9e41-478d-86ec-0382990a925f-registry-certificates\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.706018 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8696404-9e41-478d-86ec-0382990a925f-trusted-ca\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.706054 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c8696404-9e41-478d-86ec-0382990a925f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.706081 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.725018 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.807868 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-registry-tls\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.808273 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c8696404-9e41-478d-86ec-0382990a925f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.808315 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c8696404-9e41-478d-86ec-0382990a925f-registry-certificates\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.808340 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8696404-9e41-478d-86ec-0382990a925f-trusted-ca\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.808364 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c8696404-9e41-478d-86ec-0382990a925f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.808445 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrnv\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-kube-api-access-gqrnv\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.808469 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-bound-sa-token\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.809116 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c8696404-9e41-478d-86ec-0382990a925f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.809759 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8696404-9e41-478d-86ec-0382990a925f-trusted-ca\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.809895 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c8696404-9e41-478d-86ec-0382990a925f-registry-certificates\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.813658 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c8696404-9e41-478d-86ec-0382990a925f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.814016 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-registry-tls\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.822590 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrnv\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-kube-api-access-gqrnv\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs"
Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.828915 4799 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8696404-9e41-478d-86ec-0382990a925f-bound-sa-token\") pod \"image-registry-66df7c8f76-grlqs\" (UID: \"c8696404-9e41-478d-86ec-0382990a925f\") " pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" Mar 19 20:11:21 crc kubenswrapper[4799]: I0319 20:11:21.939416 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" Mar 19 20:11:22 crc kubenswrapper[4799]: I0319 20:11:22.381134 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-grlqs"] Mar 19 20:11:23 crc kubenswrapper[4799]: I0319 20:11:23.040024 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" event={"ID":"c8696404-9e41-478d-86ec-0382990a925f","Type":"ContainerStarted","Data":"bdb9090313a77c8100b4d1c5913092b3f9170d19f1b13ab088af27eafa65cef5"} Mar 19 20:11:23 crc kubenswrapper[4799]: I0319 20:11:23.040097 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" event={"ID":"c8696404-9e41-478d-86ec-0382990a925f","Type":"ContainerStarted","Data":"34d150daff5e306e0018a6cb38d0dd797bfa3c07ce360c82a562f38805b887d4"} Mar 19 20:11:23 crc kubenswrapper[4799]: I0319 20:11:23.040230 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" Mar 19 20:11:23 crc kubenswrapper[4799]: I0319 20:11:23.071297 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" podStartSLOduration=2.071270937 podStartE2EDuration="2.071270937s" podCreationTimestamp="2026-03-19 20:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
20:11:23.066211498 +0000 UTC m=+360.672164570" watchObservedRunningTime="2026-03-19 20:11:23.071270937 +0000 UTC m=+360.677224039" Mar 19 20:11:41 crc kubenswrapper[4799]: I0319 20:11:41.949863 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-grlqs" Mar 19 20:11:42 crc kubenswrapper[4799]: I0319 20:11:42.019832 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbgk7"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.916198 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lf2lf"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.918626 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lf2lf" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="registry-server" containerID="cri-o://e57a2e803b83ab7674ce28943dc2ded590399d89fe3a57d3183069d7ac4756fe" gracePeriod=30 Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.926153 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2gp4f"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.926729 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2gp4f" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="registry-server" containerID="cri-o://cb60445b1d4c3eae208e85e127a8f2f163fb086903f8eea73854648bcebd72fb" gracePeriod=30 Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.938064 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wjn4f"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.938281 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" 
podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" containerID="cri-o://389a918cd6480682b65d0827bdf2697dbc887e13162cc5866f65d190a4466074" gracePeriod=30 Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.950608 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4g9c"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.951099 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m4g9c" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="registry-server" containerID="cri-o://f89f71ec26713ac35cc2030ba689812451062b038978170b0e8787322de311b5" gracePeriod=30 Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.960661 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6hbz"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.962122 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.964069 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2dks"] Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.964402 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k2dks" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="registry-server" containerID="cri-o://47874d181aea8606c62bcfbb60557de06f8956f419be258bdc88d203b19e29cc" gracePeriod=30 Mar 19 20:11:46 crc kubenswrapper[4799]: I0319 20:11:46.967316 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6hbz"] Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.093997 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40e9b27a-0c5f-45a3-b424-d1b289b65167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.094086 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp7kf\" (UniqueName: \"kubernetes.io/projected/40e9b27a-0c5f-45a3-b424-d1b289b65167-kube-api-access-gp7kf\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.094139 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/40e9b27a-0c5f-45a3-b424-d1b289b65167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.185502 4799 generic.go:334] "Generic (PLEG): container finished" podID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerID="389a918cd6480682b65d0827bdf2697dbc887e13162cc5866f65d190a4466074" exitCode=0 Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.185604 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" event={"ID":"e7a739cc-cb77-45dc-9811-661046ccf05b","Type":"ContainerDied","Data":"389a918cd6480682b65d0827bdf2697dbc887e13162cc5866f65d190a4466074"} Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.185674 4799 scope.go:117] "RemoveContainer" containerID="cd3deca1b9f8858ef86103f41d40b0d37ff03a0f5444ba684060abe85cac4b48" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.187947 4799 generic.go:334] "Generic (PLEG): container finished" podID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerID="f89f71ec26713ac35cc2030ba689812451062b038978170b0e8787322de311b5" exitCode=0 Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.188009 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerDied","Data":"f89f71ec26713ac35cc2030ba689812451062b038978170b0e8787322de311b5"} Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.189956 4799 generic.go:334] "Generic (PLEG): container finished" podID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerID="47874d181aea8606c62bcfbb60557de06f8956f419be258bdc88d203b19e29cc" exitCode=0 Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.190029 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-k2dks" event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerDied","Data":"47874d181aea8606c62bcfbb60557de06f8956f419be258bdc88d203b19e29cc"} Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.192050 4799 generic.go:334] "Generic (PLEG): container finished" podID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerID="cb60445b1d4c3eae208e85e127a8f2f163fb086903f8eea73854648bcebd72fb" exitCode=0 Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.192116 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gp4f" event={"ID":"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40","Type":"ContainerDied","Data":"cb60445b1d4c3eae208e85e127a8f2f163fb086903f8eea73854648bcebd72fb"} Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.194874 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp7kf\" (UniqueName: \"kubernetes.io/projected/40e9b27a-0c5f-45a3-b424-d1b289b65167-kube-api-access-gp7kf\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.194938 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40e9b27a-0c5f-45a3-b424-d1b289b65167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.194979 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40e9b27a-0c5f-45a3-b424-d1b289b65167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: 
\"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.195496 4799 generic.go:334] "Generic (PLEG): container finished" podID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerID="e57a2e803b83ab7674ce28943dc2ded590399d89fe3a57d3183069d7ac4756fe" exitCode=0 Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.195545 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerDied","Data":"e57a2e803b83ab7674ce28943dc2ded590399d89fe3a57d3183069d7ac4756fe"} Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.195998 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40e9b27a-0c5f-45a3-b424-d1b289b65167-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.202087 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40e9b27a-0c5f-45a3-b424-d1b289b65167-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.213233 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp7kf\" (UniqueName: \"kubernetes.io/projected/40e9b27a-0c5f-45a3-b424-d1b289b65167-kube-api-access-gp7kf\") pod \"marketplace-operator-79b997595-s6hbz\" (UID: \"40e9b27a-0c5f-45a3-b424-d1b289b65167\") " pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc 
kubenswrapper[4799]: I0319 20:11:47.292925 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.458668 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.466514 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.479557 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.487976 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.499444 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598719 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfj45\" (UniqueName: \"kubernetes.io/projected/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-kube-api-access-bfj45\") pod \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598776 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-trusted-ca\") pod \"e7a739cc-cb77-45dc-9811-661046ccf05b\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598803 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-utilities\") pod \"3b719eba-9287-4b38-9749-f5c2e09e32e4\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598830 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-operator-metrics\") pod \"e7a739cc-cb77-45dc-9811-661046ccf05b\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598873 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-catalog-content\") pod \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598905 4799 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-catalog-content\") pod \"3b719eba-9287-4b38-9749-f5c2e09e32e4\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598933 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-catalog-content\") pod \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598956 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-utilities\") pod \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\" (UID: \"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598975 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtwfc\" (UniqueName: \"kubernetes.io/projected/0dfa7f99-f171-41a9-8931-07caaeaa06e1-kube-api-access-wtwfc\") pod \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.598994 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb5dx\" (UniqueName: \"kubernetes.io/projected/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-kube-api-access-cb5dx\") pod \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599011 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5npwg\" (UniqueName: \"kubernetes.io/projected/3b719eba-9287-4b38-9749-f5c2e09e32e4-kube-api-access-5npwg\") pod 
\"3b719eba-9287-4b38-9749-f5c2e09e32e4\" (UID: \"3b719eba-9287-4b38-9749-f5c2e09e32e4\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599028 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-utilities\") pod \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599061 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-catalog-content\") pod \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\" (UID: \"0dfa7f99-f171-41a9-8931-07caaeaa06e1\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599079 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-utilities\") pod \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\" (UID: \"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599094 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgj9w\" (UniqueName: \"kubernetes.io/projected/e7a739cc-cb77-45dc-9811-661046ccf05b-kube-api-access-tgj9w\") pod \"e7a739cc-cb77-45dc-9811-661046ccf05b\" (UID: \"e7a739cc-cb77-45dc-9811-661046ccf05b\") " Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599882 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-utilities" (OuterVolumeSpecName: "utilities") pod "3b719eba-9287-4b38-9749-f5c2e09e32e4" (UID: "3b719eba-9287-4b38-9749-f5c2e09e32e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599939 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-utilities" (OuterVolumeSpecName: "utilities") pod "03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" (UID: "03f0bc9a-1283-46a4-a6df-05d8ee9c3c40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.599964 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e7a739cc-cb77-45dc-9811-661046ccf05b" (UID: "e7a739cc-cb77-45dc-9811-661046ccf05b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.600985 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-utilities" (OuterVolumeSpecName: "utilities") pod "0dfa7f99-f171-41a9-8931-07caaeaa06e1" (UID: "0dfa7f99-f171-41a9-8931-07caaeaa06e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.601028 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-utilities" (OuterVolumeSpecName: "utilities") pod "a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" (UID: "a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.603706 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e7a739cc-cb77-45dc-9811-661046ccf05b" (UID: "e7a739cc-cb77-45dc-9811-661046ccf05b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.603884 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-kube-api-access-cb5dx" (OuterVolumeSpecName: "kube-api-access-cb5dx") pod "a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" (UID: "a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e"). InnerVolumeSpecName "kube-api-access-cb5dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.603970 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b719eba-9287-4b38-9749-f5c2e09e32e4-kube-api-access-5npwg" (OuterVolumeSpecName: "kube-api-access-5npwg") pod "3b719eba-9287-4b38-9749-f5c2e09e32e4" (UID: "3b719eba-9287-4b38-9749-f5c2e09e32e4"). InnerVolumeSpecName "kube-api-access-5npwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.603974 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfa7f99-f171-41a9-8931-07caaeaa06e1-kube-api-access-wtwfc" (OuterVolumeSpecName: "kube-api-access-wtwfc") pod "0dfa7f99-f171-41a9-8931-07caaeaa06e1" (UID: "0dfa7f99-f171-41a9-8931-07caaeaa06e1"). InnerVolumeSpecName "kube-api-access-wtwfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.604051 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a739cc-cb77-45dc-9811-661046ccf05b-kube-api-access-tgj9w" (OuterVolumeSpecName: "kube-api-access-tgj9w") pod "e7a739cc-cb77-45dc-9811-661046ccf05b" (UID: "e7a739cc-cb77-45dc-9811-661046ccf05b"). InnerVolumeSpecName "kube-api-access-tgj9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.607275 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-kube-api-access-bfj45" (OuterVolumeSpecName: "kube-api-access-bfj45") pod "03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" (UID: "03f0bc9a-1283-46a4-a6df-05d8ee9c3c40"). InnerVolumeSpecName "kube-api-access-bfj45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.640406 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" (UID: "a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.654612 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" (UID: "03f0bc9a-1283-46a4-a6df-05d8ee9c3c40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.657449 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dfa7f99-f171-41a9-8931-07caaeaa06e1" (UID: "0dfa7f99-f171-41a9-8931-07caaeaa06e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700257 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700300 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtwfc\" (UniqueName: \"kubernetes.io/projected/0dfa7f99-f171-41a9-8931-07caaeaa06e1-kube-api-access-wtwfc\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700332 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700344 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb5dx\" (UniqueName: \"kubernetes.io/projected/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-kube-api-access-cb5dx\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700356 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5npwg\" (UniqueName: \"kubernetes.io/projected/3b719eba-9287-4b38-9749-f5c2e09e32e4-kube-api-access-5npwg\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700368 4799 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700406 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dfa7f99-f171-41a9-8931-07caaeaa06e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700419 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700429 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgj9w\" (UniqueName: \"kubernetes.io/projected/e7a739cc-cb77-45dc-9811-661046ccf05b-kube-api-access-tgj9w\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700439 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfj45\" (UniqueName: \"kubernetes.io/projected/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40-kube-api-access-bfj45\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700450 4799 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700460 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700471 4799 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/e7a739cc-cb77-45dc-9811-661046ccf05b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.700483 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.741059 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b719eba-9287-4b38-9749-f5c2e09e32e4" (UID: "3b719eba-9287-4b38-9749-f5c2e09e32e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.773999 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s6hbz"] Mar 19 20:11:47 crc kubenswrapper[4799]: W0319 20:11:47.782088 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e9b27a_0c5f_45a3_b424_d1b289b65167.slice/crio-e68085773b134fad895660acd8878bd351dc0068c5650af0a3bdbd533db5947e WatchSource:0}: Error finding container e68085773b134fad895660acd8878bd351dc0068c5650af0a3bdbd533db5947e: Status 404 returned error can't find the container with id e68085773b134fad895660acd8878bd351dc0068c5650af0a3bdbd533db5947e Mar 19 20:11:47 crc kubenswrapper[4799]: I0319 20:11:47.802026 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b719eba-9287-4b38-9749-f5c2e09e32e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.205466 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-lf2lf" event={"ID":"0dfa7f99-f171-41a9-8931-07caaeaa06e1","Type":"ContainerDied","Data":"190353479dd68dc5e8d42761ffede034a1ae51f50e6154cfc35cea929e55a255"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.205527 4799 scope.go:117] "RemoveContainer" containerID="e57a2e803b83ab7674ce28943dc2ded590399d89fe3a57d3183069d7ac4756fe" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.205535 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf2lf" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.209032 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" event={"ID":"40e9b27a-0c5f-45a3-b424-d1b289b65167","Type":"ContainerStarted","Data":"a45d960da66948d95e62907cdd472e66bb1bd5ecea33b498fa15d3193e755add"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.209096 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" event={"ID":"40e9b27a-0c5f-45a3-b424-d1b289b65167","Type":"ContainerStarted","Data":"e68085773b134fad895660acd8878bd351dc0068c5650af0a3bdbd533db5947e"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.212979 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" event={"ID":"e7a739cc-cb77-45dc-9811-661046ccf05b","Type":"ContainerDied","Data":"6ee473103d264e9b3b762a626c12fd6bfc7ca00a408944ae24edffc3e21f832a"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.212988 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wjn4f" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.216278 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4g9c" event={"ID":"a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e","Type":"ContainerDied","Data":"a4bc170b685aa67ff2b4435f94a9a0fe555087949ce845c0c24d65a6afa4d10f"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.216348 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4g9c" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.228730 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2dks" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.228752 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2dks" event={"ID":"3b719eba-9287-4b38-9749-f5c2e09e32e4","Type":"ContainerDied","Data":"706ec47fbc9175166d46dab2382c0256e641f7d9aa4ca1420fca117556119fd2"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.234245 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" podStartSLOduration=2.234224028 podStartE2EDuration="2.234224028s" podCreationTimestamp="2026-03-19 20:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:11:48.231744245 +0000 UTC m=+385.837697347" watchObservedRunningTime="2026-03-19 20:11:48.234224028 +0000 UTC m=+385.840177130" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.246082 4799 scope.go:117] "RemoveContainer" containerID="62e948aad1826163f2570189998fb61749dd6477f308ce35ec2c90edbf07fddd" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.246235 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-2gp4f" event={"ID":"03f0bc9a-1283-46a4-a6df-05d8ee9c3c40","Type":"ContainerDied","Data":"645fbaf0afe66d5d1c6c4939df667c813994072ef6a7d9530571b233a15ea2d4"} Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.246429 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2gp4f" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.265475 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lf2lf"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.269439 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lf2lf"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.288925 4799 scope.go:117] "RemoveContainer" containerID="23ed53307635e45e4bd4ce99a1064af92130379499e66d59bec054dd86acb90f" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.294949 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wjn4f"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.299779 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wjn4f"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.306425 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4g9c"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.313087 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4g9c"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.316157 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2dks"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.321139 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k2dks"] Mar 19 
20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.324051 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2gp4f"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.326621 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2gp4f"] Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.329924 4799 scope.go:117] "RemoveContainer" containerID="389a918cd6480682b65d0827bdf2697dbc887e13162cc5866f65d190a4466074" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.341205 4799 scope.go:117] "RemoveContainer" containerID="f89f71ec26713ac35cc2030ba689812451062b038978170b0e8787322de311b5" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.355022 4799 scope.go:117] "RemoveContainer" containerID="e1861d460b2af27d98d8f1e474d168a6ef51077309b26a7494ae75630aa450a7" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.367484 4799 scope.go:117] "RemoveContainer" containerID="f2770ee94941c7eafee41b4b2c52c16dfb2ae9655036fff1828435ad8305240d" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.385876 4799 scope.go:117] "RemoveContainer" containerID="47874d181aea8606c62bcfbb60557de06f8956f419be258bdc88d203b19e29cc" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.409409 4799 scope.go:117] "RemoveContainer" containerID="9bca2bfa6141ae9428d30cb75a60bad31ae20ec0c1e6cdd7cdf82494c479c3cf" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.422440 4799 scope.go:117] "RemoveContainer" containerID="b770c88b279e1812d107440ff3cda4f6a9de11ed9bb8d61247a2d2ef0ec9ab50" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.437337 4799 scope.go:117] "RemoveContainer" containerID="cb60445b1d4c3eae208e85e127a8f2f163fb086903f8eea73854648bcebd72fb" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.449910 4799 scope.go:117] "RemoveContainer" containerID="1133972b262689629f79ba55be1b1b72d5622255a70981f85f69392e63511a95" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 
20:11:48.465701 4799 scope.go:117] "RemoveContainer" containerID="0af840e72eff2e61f1380262ed655205ee94d6cba83fd43e9c8dbe2b1b6234ac" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.929990 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5gh9p"] Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.930792 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.930827 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.930843 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.930855 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.930878 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.930890 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.930911 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.930924 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.930941 4799 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.930952 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.930969 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.930980 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931002 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931013 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931025 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931037 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931052 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931064 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931080 4799 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931092 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931118 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931130 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931146 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931158 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="extract-content" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931179 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931190 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="extract-utilities" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931410 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931439 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931465 4799 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931485 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931507 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931522 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" containerName="registry-server" Mar 19 20:11:48 crc kubenswrapper[4799]: E0319 20:11:48.931708 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.931729 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" containerName="marketplace-operator" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.936499 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.938522 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 20:11:48 crc kubenswrapper[4799]: I0319 20:11:48.953166 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5gh9p"] Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.116615 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-utilities\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.116756 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-catalog-content\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.117286 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgc7k\" (UniqueName: \"kubernetes.io/projected/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-kube-api-access-qgc7k\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.127889 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f0bc9a-1283-46a4-a6df-05d8ee9c3c40" path="/var/lib/kubelet/pods/03f0bc9a-1283-46a4-a6df-05d8ee9c3c40/volumes" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 
20:11:49.128509 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dfa7f99-f171-41a9-8931-07caaeaa06e1" path="/var/lib/kubelet/pods/0dfa7f99-f171-41a9-8931-07caaeaa06e1/volumes" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.129134 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b719eba-9287-4b38-9749-f5c2e09e32e4" path="/var/lib/kubelet/pods/3b719eba-9287-4b38-9749-f5c2e09e32e4/volumes" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.130241 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e" path="/var/lib/kubelet/pods/a58b5c1c-2c25-4bb0-b457-7c30bc24ce4e/volumes" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.131039 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a739cc-cb77-45dc-9811-661046ccf05b" path="/var/lib/kubelet/pods/e7a739cc-cb77-45dc-9811-661046ccf05b/volumes" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.218561 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgc7k\" (UniqueName: \"kubernetes.io/projected/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-kube-api-access-qgc7k\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.219375 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-utilities\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.219539 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-catalog-content\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.219953 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-catalog-content\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.220266 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-utilities\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.248428 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgc7k\" (UniqueName: \"kubernetes.io/projected/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-kube-api-access-qgc7k\") pod \"certified-operators-5gh9p\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.254350 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.263577 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.267520 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-s6hbz" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.531607 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tw9x"] Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.533210 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.536050 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.552671 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tw9x"] Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.631714 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd60eeaa-edc8-447f-866a-f20eefff40c8-utilities\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.631808 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd60eeaa-edc8-447f-866a-f20eefff40c8-catalog-content\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " 
pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.631830 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgbz\" (UniqueName: \"kubernetes.io/projected/dd60eeaa-edc8-447f-866a-f20eefff40c8-kube-api-access-9sgbz\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.661317 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5gh9p"] Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.732523 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd60eeaa-edc8-447f-866a-f20eefff40c8-catalog-content\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.732572 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgbz\" (UniqueName: \"kubernetes.io/projected/dd60eeaa-edc8-447f-866a-f20eefff40c8-kube-api-access-9sgbz\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.732606 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd60eeaa-edc8-447f-866a-f20eefff40c8-utilities\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.733109 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dd60eeaa-edc8-447f-866a-f20eefff40c8-utilities\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.733434 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd60eeaa-edc8-447f-866a-f20eefff40c8-catalog-content\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.754502 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgbz\" (UniqueName: \"kubernetes.io/projected/dd60eeaa-edc8-447f-866a-f20eefff40c8-kube-api-access-9sgbz\") pod \"redhat-marketplace-2tw9x\" (UID: \"dd60eeaa-edc8-447f-866a-f20eefff40c8\") " pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:49 crc kubenswrapper[4799]: I0319 20:11:49.860206 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 20:11:50.107850 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tw9x"] Mar 19 20:11:50 crc kubenswrapper[4799]: W0319 20:11:50.116210 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd60eeaa_edc8_447f_866a_f20eefff40c8.slice/crio-73bb498540716e01db77cf6ce043a1c038425552c81e5d4cd68ac5c949175ff1 WatchSource:0}: Error finding container 73bb498540716e01db77cf6ce043a1c038425552c81e5d4cd68ac5c949175ff1: Status 404 returned error can't find the container with id 73bb498540716e01db77cf6ce043a1c038425552c81e5d4cd68ac5c949175ff1 Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 20:11:50.270278 4799 generic.go:334] "Generic (PLEG): container finished" podID="dd60eeaa-edc8-447f-866a-f20eefff40c8" containerID="1f8323ce6f766ad8a25c9c8806f2d0f04b1e1901204d695a3b2a21e11e42db61" exitCode=0 Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 20:11:50.270354 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tw9x" event={"ID":"dd60eeaa-edc8-447f-866a-f20eefff40c8","Type":"ContainerDied","Data":"1f8323ce6f766ad8a25c9c8806f2d0f04b1e1901204d695a3b2a21e11e42db61"} Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 20:11:50.270429 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tw9x" event={"ID":"dd60eeaa-edc8-447f-866a-f20eefff40c8","Type":"ContainerStarted","Data":"73bb498540716e01db77cf6ce043a1c038425552c81e5d4cd68ac5c949175ff1"} Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 20:11:50.271511 4799 generic.go:334] "Generic (PLEG): container finished" podID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerID="347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7" exitCode=0 Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 
20:11:50.272071 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gh9p" event={"ID":"6a0a44ba-035d-4c3e-b653-5fc884ab78dc","Type":"ContainerDied","Data":"347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7"} Mar 19 20:11:50 crc kubenswrapper[4799]: I0319 20:11:50.272101 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gh9p" event={"ID":"6a0a44ba-035d-4c3e-b653-5fc884ab78dc","Type":"ContainerStarted","Data":"d5360605e39a400471bb910988ef617129b7c1de2b889de25283514b4336178d"} Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.332705 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwqtk"] Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.335793 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.339816 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.352095 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwqtk"] Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.458403 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-utilities\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.458512 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vp6g\" (UniqueName: 
\"kubernetes.io/projected/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-kube-api-access-6vp6g\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.458561 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-catalog-content\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.559681 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-utilities\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.559750 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vp6g\" (UniqueName: \"kubernetes.io/projected/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-kube-api-access-6vp6g\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.559783 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-catalog-content\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.560235 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-catalog-content\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.560278 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-utilities\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.595445 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vp6g\" (UniqueName: \"kubernetes.io/projected/23e7994c-4b3e-4f89-bf49-4166fd9a2a78-kube-api-access-6vp6g\") pod \"redhat-operators-zwqtk\" (UID: \"23e7994c-4b3e-4f89-bf49-4166fd9a2a78\") " pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.670877 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.929993 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kxq57"] Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.931224 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.934183 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.964986 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kxq57"] Mar 19 20:11:51 crc kubenswrapper[4799]: I0319 20:11:51.968870 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwqtk"] Mar 19 20:11:51 crc kubenswrapper[4799]: W0319 20:11:51.971545 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e7994c_4b3e_4f89_bf49_4166fd9a2a78.slice/crio-55ae968e57e26bc3328c19a0d3ca52bc0ff38296b2c9beb1633e7198f80d283a WatchSource:0}: Error finding container 55ae968e57e26bc3328c19a0d3ca52bc0ff38296b2c9beb1633e7198f80d283a: Status 404 returned error can't find the container with id 55ae968e57e26bc3328c19a0d3ca52bc0ff38296b2c9beb1633e7198f80d283a Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.065807 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b33abb02-78fb-4afe-af5b-f754a26df60c-catalog-content\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.066051 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9w9j\" (UniqueName: \"kubernetes.io/projected/b33abb02-78fb-4afe-af5b-f754a26df60c-kube-api-access-j9w9j\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" 
Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.066259 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b33abb02-78fb-4afe-af5b-f754a26df60c-utilities\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.167952 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b33abb02-78fb-4afe-af5b-f754a26df60c-utilities\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.168237 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b33abb02-78fb-4afe-af5b-f754a26df60c-catalog-content\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.168430 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9w9j\" (UniqueName: \"kubernetes.io/projected/b33abb02-78fb-4afe-af5b-f754a26df60c-kube-api-access-j9w9j\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.169714 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b33abb02-78fb-4afe-af5b-f754a26df60c-catalog-content\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 
crc kubenswrapper[4799]: I0319 20:11:52.169758 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b33abb02-78fb-4afe-af5b-f754a26df60c-utilities\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.191285 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9w9j\" (UniqueName: \"kubernetes.io/projected/b33abb02-78fb-4afe-af5b-f754a26df60c-kube-api-access-j9w9j\") pod \"community-operators-kxq57\" (UID: \"b33abb02-78fb-4afe-af5b-f754a26df60c\") " pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.265973 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.287317 4799 generic.go:334] "Generic (PLEG): container finished" podID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerID="57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3" exitCode=0 Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.287402 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gh9p" event={"ID":"6a0a44ba-035d-4c3e-b653-5fc884ab78dc","Type":"ContainerDied","Data":"57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3"} Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.289788 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwqtk" event={"ID":"23e7994c-4b3e-4f89-bf49-4166fd9a2a78","Type":"ContainerDied","Data":"ee75a937c691461a41cfd1696b6ee7523f08faa772e7630018a9c777221c6267"} Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.289919 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="23e7994c-4b3e-4f89-bf49-4166fd9a2a78" containerID="ee75a937c691461a41cfd1696b6ee7523f08faa772e7630018a9c777221c6267" exitCode=0 Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.290055 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwqtk" event={"ID":"23e7994c-4b3e-4f89-bf49-4166fd9a2a78","Type":"ContainerStarted","Data":"55ae968e57e26bc3328c19a0d3ca52bc0ff38296b2c9beb1633e7198f80d283a"} Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.299357 4799 generic.go:334] "Generic (PLEG): container finished" podID="dd60eeaa-edc8-447f-866a-f20eefff40c8" containerID="58a35d899ec9d0c4412c85eb0d009f069c36a7072dbfd3adc5026b7d25e6a5b9" exitCode=0 Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.299468 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tw9x" event={"ID":"dd60eeaa-edc8-447f-866a-f20eefff40c8","Type":"ContainerDied","Data":"58a35d899ec9d0c4412c85eb0d009f069c36a7072dbfd3adc5026b7d25e6a5b9"} Mar 19 20:11:52 crc kubenswrapper[4799]: I0319 20:11:52.711847 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kxq57"] Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.310241 4799 generic.go:334] "Generic (PLEG): container finished" podID="b33abb02-78fb-4afe-af5b-f754a26df60c" containerID="1b2b538014a202ac1a7c53ff336047d0443166c1a291cfa5f0aded81fcdf72ba" exitCode=0 Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.310293 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kxq57" event={"ID":"b33abb02-78fb-4afe-af5b-f754a26df60c","Type":"ContainerDied","Data":"1b2b538014a202ac1a7c53ff336047d0443166c1a291cfa5f0aded81fcdf72ba"} Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.310582 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kxq57" 
event={"ID":"b33abb02-78fb-4afe-af5b-f754a26df60c","Type":"ContainerStarted","Data":"e9b406c33c7a68aec6340c5ae5d5e78975ac54a12af8a1616ef7dbebc93b5868"} Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.314025 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tw9x" event={"ID":"dd60eeaa-edc8-447f-866a-f20eefff40c8","Type":"ContainerStarted","Data":"5e8ee66ff8671c9b456f87f764c041aed269d714072c4fdd0d84fe17a4938473"} Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.317316 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gh9p" event={"ID":"6a0a44ba-035d-4c3e-b653-5fc884ab78dc","Type":"ContainerStarted","Data":"327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d"} Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.371802 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5gh9p" podStartSLOduration=2.893796581 podStartE2EDuration="5.371778791s" podCreationTimestamp="2026-03-19 20:11:48 +0000 UTC" firstStartedPulling="2026-03-19 20:11:50.273607327 +0000 UTC m=+387.879560399" lastFinishedPulling="2026-03-19 20:11:52.751589497 +0000 UTC m=+390.357542609" observedRunningTime="2026-03-19 20:11:53.368371498 +0000 UTC m=+390.974324580" watchObservedRunningTime="2026-03-19 20:11:53.371778791 +0000 UTC m=+390.977731863" Mar 19 20:11:53 crc kubenswrapper[4799]: I0319 20:11:53.392716 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tw9x" podStartSLOduration=1.731799079 podStartE2EDuration="4.392657666s" podCreationTimestamp="2026-03-19 20:11:49 +0000 UTC" firstStartedPulling="2026-03-19 20:11:50.273476063 +0000 UTC m=+387.879429135" lastFinishedPulling="2026-03-19 20:11:52.93433464 +0000 UTC m=+390.540287722" observedRunningTime="2026-03-19 20:11:53.389553843 +0000 UTC m=+390.995506925" 
watchObservedRunningTime="2026-03-19 20:11:53.392657666 +0000 UTC m=+390.998610748" Mar 19 20:11:54 crc kubenswrapper[4799]: I0319 20:11:54.324260 4799 generic.go:334] "Generic (PLEG): container finished" podID="23e7994c-4b3e-4f89-bf49-4166fd9a2a78" containerID="f5f4af15e79fa73eb7410fdf2604493a2a29514961b3ef170c36afb42ae2396f" exitCode=0 Mar 19 20:11:54 crc kubenswrapper[4799]: I0319 20:11:54.324482 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwqtk" event={"ID":"23e7994c-4b3e-4f89-bf49-4166fd9a2a78","Type":"ContainerDied","Data":"f5f4af15e79fa73eb7410fdf2604493a2a29514961b3ef170c36afb42ae2396f"} Mar 19 20:11:55 crc kubenswrapper[4799]: I0319 20:11:55.333271 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwqtk" event={"ID":"23e7994c-4b3e-4f89-bf49-4166fd9a2a78","Type":"ContainerStarted","Data":"2b1dbe9491dd89b3857749349893491a4c059969ed47dc3f40167e245b943d38"} Mar 19 20:11:55 crc kubenswrapper[4799]: I0319 20:11:55.335488 4799 generic.go:334] "Generic (PLEG): container finished" podID="b33abb02-78fb-4afe-af5b-f754a26df60c" containerID="c86dea98957c17a832c551466f8afaaefa15392dbc133fed8102fae0b420687f" exitCode=0 Mar 19 20:11:55 crc kubenswrapper[4799]: I0319 20:11:55.335526 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kxq57" event={"ID":"b33abb02-78fb-4afe-af5b-f754a26df60c","Type":"ContainerDied","Data":"c86dea98957c17a832c551466f8afaaefa15392dbc133fed8102fae0b420687f"} Mar 19 20:11:55 crc kubenswrapper[4799]: I0319 20:11:55.364898 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwqtk" podStartSLOduration=1.845221432 podStartE2EDuration="4.364867329s" podCreationTimestamp="2026-03-19 20:11:51 +0000 UTC" firstStartedPulling="2026-03-19 20:11:52.29519406 +0000 UTC m=+389.901147162" lastFinishedPulling="2026-03-19 20:11:54.814839947 +0000 
UTC m=+392.420793059" observedRunningTime="2026-03-19 20:11:55.3572182 +0000 UTC m=+392.963171302" watchObservedRunningTime="2026-03-19 20:11:55.364867329 +0000 UTC m=+392.970820441" Mar 19 20:11:56 crc kubenswrapper[4799]: I0319 20:11:56.360507 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kxq57" event={"ID":"b33abb02-78fb-4afe-af5b-f754a26df60c","Type":"ContainerStarted","Data":"f107073cce92f2821f746889236b129f1a96079f1ad297296d5557c555faa0d0"} Mar 19 20:11:56 crc kubenswrapper[4799]: I0319 20:11:56.385706 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kxq57" podStartSLOduration=2.721730409 podStartE2EDuration="5.385682639s" podCreationTimestamp="2026-03-19 20:11:51 +0000 UTC" firstStartedPulling="2026-03-19 20:11:53.3123386 +0000 UTC m=+390.918291722" lastFinishedPulling="2026-03-19 20:11:55.97629084 +0000 UTC m=+393.582243952" observedRunningTime="2026-03-19 20:11:56.383253646 +0000 UTC m=+393.989206748" watchObservedRunningTime="2026-03-19 20:11:56.385682639 +0000 UTC m=+393.991635751" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.255452 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.255866 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.325080 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.432285 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.860777 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.860854 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:11:59 crc kubenswrapper[4799]: I0319 20:11:59.937983 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.144784 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565852-cqlgg"] Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.145377 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.149050 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.149274 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.150353 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.162897 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-cqlgg"] Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.289497 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzs2\" (UniqueName: \"kubernetes.io/projected/79c55479-c3ec-4e72-b12d-d287a8c82f42-kube-api-access-qfzs2\") pod \"auto-csr-approver-29565852-cqlgg\" (UID: \"79c55479-c3ec-4e72-b12d-d287a8c82f42\") " pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:00 crc 
kubenswrapper[4799]: I0319 20:12:00.391702 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzs2\" (UniqueName: \"kubernetes.io/projected/79c55479-c3ec-4e72-b12d-d287a8c82f42-kube-api-access-qfzs2\") pod \"auto-csr-approver-29565852-cqlgg\" (UID: \"79c55479-c3ec-4e72-b12d-d287a8c82f42\") " pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.437354 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzs2\" (UniqueName: \"kubernetes.io/projected/79c55479-c3ec-4e72-b12d-d287a8c82f42-kube-api-access-qfzs2\") pod \"auto-csr-approver-29565852-cqlgg\" (UID: \"79c55479-c3ec-4e72-b12d-d287a8c82f42\") " pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.438033 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tw9x" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.494025 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:00 crc kubenswrapper[4799]: I0319 20:12:00.903774 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-cqlgg"] Mar 19 20:12:00 crc kubenswrapper[4799]: W0319 20:12:00.915069 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c55479_c3ec_4e72_b12d_d287a8c82f42.slice/crio-af29edf335f62beeb5beae46c08bc6b0b8aa8b0f09aea5e2c8a387cbeb2b3c38 WatchSource:0}: Error finding container af29edf335f62beeb5beae46c08bc6b0b8aa8b0f09aea5e2c8a387cbeb2b3c38: Status 404 returned error can't find the container with id af29edf335f62beeb5beae46c08bc6b0b8aa8b0f09aea5e2c8a387cbeb2b3c38 Mar 19 20:12:01 crc kubenswrapper[4799]: I0319 20:12:01.392229 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" event={"ID":"79c55479-c3ec-4e72-b12d-d287a8c82f42","Type":"ContainerStarted","Data":"af29edf335f62beeb5beae46c08bc6b0b8aa8b0f09aea5e2c8a387cbeb2b3c38"} Mar 19 20:12:01 crc kubenswrapper[4799]: I0319 20:12:01.671574 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:12:01 crc kubenswrapper[4799]: I0319 20:12:01.672113 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.266256 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.266328 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.330014 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.398102 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" event={"ID":"79c55479-c3ec-4e72-b12d-d287a8c82f42","Type":"ContainerStarted","Data":"39137657b9976e7dec982390994cc08419caf262a5c3ae241bb7f3d308122b6d"} Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.419964 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" podStartSLOduration=1.357647897 podStartE2EDuration="2.41994542s" podCreationTimestamp="2026-03-19 20:12:00 +0000 UTC" firstStartedPulling="2026-03-19 20:12:00.917492055 +0000 UTC m=+398.523445147" lastFinishedPulling="2026-03-19 20:12:01.979789548 +0000 UTC m=+399.585742670" observedRunningTime="2026-03-19 20:12:02.413151946 +0000 UTC m=+400.019105028" watchObservedRunningTime="2026-03-19 20:12:02.41994542 +0000 UTC m=+400.025898502" Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.438031 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kxq57" Mar 19 20:12:02 crc kubenswrapper[4799]: I0319 20:12:02.742773 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zwqtk" podUID="23e7994c-4b3e-4f89-bf49-4166fd9a2a78" containerName="registry-server" probeResult="failure" output=< Mar 19 20:12:02 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:12:02 crc kubenswrapper[4799]: > Mar 19 20:12:03 crc kubenswrapper[4799]: I0319 20:12:03.405498 4799 generic.go:334] "Generic (PLEG): container finished" podID="79c55479-c3ec-4e72-b12d-d287a8c82f42" containerID="39137657b9976e7dec982390994cc08419caf262a5c3ae241bb7f3d308122b6d" exitCode=0 Mar 19 20:12:03 crc kubenswrapper[4799]: I0319 20:12:03.405596 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" event={"ID":"79c55479-c3ec-4e72-b12d-d287a8c82f42","Type":"ContainerDied","Data":"39137657b9976e7dec982390994cc08419caf262a5c3ae241bb7f3d308122b6d"} Mar 19 20:12:04 crc kubenswrapper[4799]: I0319 20:12:04.642354 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:04 crc kubenswrapper[4799]: I0319 20:12:04.660226 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzs2\" (UniqueName: \"kubernetes.io/projected/79c55479-c3ec-4e72-b12d-d287a8c82f42-kube-api-access-qfzs2\") pod \"79c55479-c3ec-4e72-b12d-d287a8c82f42\" (UID: \"79c55479-c3ec-4e72-b12d-d287a8c82f42\") " Mar 19 20:12:04 crc kubenswrapper[4799]: I0319 20:12:04.666925 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c55479-c3ec-4e72-b12d-d287a8c82f42-kube-api-access-qfzs2" (OuterVolumeSpecName: "kube-api-access-qfzs2") pod "79c55479-c3ec-4e72-b12d-d287a8c82f42" (UID: "79c55479-c3ec-4e72-b12d-d287a8c82f42"). InnerVolumeSpecName "kube-api-access-qfzs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:12:04 crc kubenswrapper[4799]: I0319 20:12:04.761277 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzs2\" (UniqueName: \"kubernetes.io/projected/79c55479-c3ec-4e72-b12d-d287a8c82f42-kube-api-access-qfzs2\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:05 crc kubenswrapper[4799]: I0319 20:12:05.419051 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" event={"ID":"79c55479-c3ec-4e72-b12d-d287a8c82f42","Type":"ContainerDied","Data":"af29edf335f62beeb5beae46c08bc6b0b8aa8b0f09aea5e2c8a387cbeb2b3c38"} Mar 19 20:12:05 crc kubenswrapper[4799]: I0319 20:12:05.419110 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af29edf335f62beeb5beae46c08bc6b0b8aa8b0f09aea5e2c8a387cbeb2b3c38" Mar 19 20:12:05 crc kubenswrapper[4799]: I0319 20:12:05.419126 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-cqlgg" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.083192 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" podUID="f4f4875b-2470-43e2-aa8a-ae48871e0ca9" containerName="registry" containerID="cri-o://9b01aa447794753c58cfbc9b416eb1515d077b6a64a45d0138fb1d1bc898278c" gracePeriod=30 Mar 19 20:12:07 crc kubenswrapper[4799]: E0319 20:12:07.201235 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f4875b_2470_43e2_aa8a_ae48871e0ca9.slice/crio-conmon-9b01aa447794753c58cfbc9b416eb1515d077b6a64a45d0138fb1d1bc898278c.scope\": RecentStats: unable to find data in memory cache]" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.433642 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="f4f4875b-2470-43e2-aa8a-ae48871e0ca9" containerID="9b01aa447794753c58cfbc9b416eb1515d077b6a64a45d0138fb1d1bc898278c" exitCode=0 Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.433686 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" event={"ID":"f4f4875b-2470-43e2-aa8a-ae48871e0ca9","Type":"ContainerDied","Data":"9b01aa447794753c58cfbc9b416eb1515d077b6a64a45d0138fb1d1bc898278c"} Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.506190 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.700711 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-certificates\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.700927 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.700972 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-tls\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.700999 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-installation-pull-secrets\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.701048 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqrqq\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-kube-api-access-pqrqq\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.701102 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-ca-trust-extracted\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.701124 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-bound-sa-token\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.701193 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-trusted-ca\") pod \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\" (UID: \"f4f4875b-2470-43e2-aa8a-ae48871e0ca9\") " Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.702497 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.703652 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.706760 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.709173 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.714078 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.714739 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-kube-api-access-pqrqq" (OuterVolumeSpecName: "kube-api-access-pqrqq") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "kube-api-access-pqrqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.717856 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.721036 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f4f4875b-2470-43e2-aa8a-ae48871e0ca9" (UID: "f4f4875b-2470-43e2-aa8a-ae48871e0ca9"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.803957 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.804003 4799 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.804019 4799 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.804031 4799 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.804043 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqrqq\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-kube-api-access-pqrqq\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.804055 4799 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:07 crc kubenswrapper[4799]: I0319 20:12:07.804068 4799 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f4875b-2470-43e2-aa8a-ae48871e0ca9-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 20:12:08 crc 
kubenswrapper[4799]: I0319 20:12:08.440206 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" event={"ID":"f4f4875b-2470-43e2-aa8a-ae48871e0ca9","Type":"ContainerDied","Data":"9a3d95bc26d055e2de59859bf5ada1f33ba76bd8fe28d082364c8a8b4aa4a5d9"} Mar 19 20:12:08 crc kubenswrapper[4799]: I0319 20:12:08.440257 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nbgk7" Mar 19 20:12:08 crc kubenswrapper[4799]: I0319 20:12:08.440268 4799 scope.go:117] "RemoveContainer" containerID="9b01aa447794753c58cfbc9b416eb1515d077b6a64a45d0138fb1d1bc898278c" Mar 19 20:12:08 crc kubenswrapper[4799]: I0319 20:12:08.471565 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbgk7"] Mar 19 20:12:08 crc kubenswrapper[4799]: I0319 20:12:08.475438 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nbgk7"] Mar 19 20:12:09 crc kubenswrapper[4799]: I0319 20:12:09.129255 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f4875b-2470-43e2-aa8a-ae48871e0ca9" path="/var/lib/kubelet/pods/f4f4875b-2470-43e2-aa8a-ae48871e0ca9/volumes" Mar 19 20:12:11 crc kubenswrapper[4799]: I0319 20:12:11.740420 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:12:11 crc kubenswrapper[4799]: I0319 20:12:11.789540 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwqtk" Mar 19 20:12:58 crc kubenswrapper[4799]: I0319 20:12:58.756048 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 19 20:12:58 crc kubenswrapper[4799]: I0319 20:12:58.756925 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:13:28 crc kubenswrapper[4799]: I0319 20:13:28.756733 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:13:28 crc kubenswrapper[4799]: I0319 20:13:28.757551 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:13:58 crc kubenswrapper[4799]: I0319 20:13:58.755337 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:13:58 crc kubenswrapper[4799]: I0319 20:13:58.755856 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:13:58 crc kubenswrapper[4799]: I0319 20:13:58.755904 4799 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:13:58 crc kubenswrapper[4799]: I0319 20:13:58.756542 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:13:58 crc kubenswrapper[4799]: I0319 20:13:58.756613 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2" gracePeriod=600 Mar 19 20:13:58 crc kubenswrapper[4799]: E0319 20:13:58.883255 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf986000_80c1_4cf1_8648_d2f7ee370e88.slice/crio-conmon-7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf986000_80c1_4cf1_8648_d2f7ee370e88.slice/crio-7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2.scope\": RecentStats: unable to find data in memory cache]" Mar 19 20:13:59 crc kubenswrapper[4799]: I0319 20:13:59.191770 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2" exitCode=0 Mar 19 20:13:59 crc kubenswrapper[4799]: I0319 20:13:59.191831 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2"} Mar 19 20:13:59 crc kubenswrapper[4799]: I0319 20:13:59.192060 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"74d053865b82ea1997ab8189b4caa9e61f0e0eecb338b66d10379da64224a7e7"} Mar 19 20:13:59 crc kubenswrapper[4799]: I0319 20:13:59.192082 4799 scope.go:117] "RemoveContainer" containerID="534a74ab108230f72948e95195e5455a12091dc31df0769e7ef82fde4f483427" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.151714 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565854-fjk2f"] Mar 19 20:14:00 crc kubenswrapper[4799]: E0319 20:14:00.152366 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f4875b-2470-43e2-aa8a-ae48871e0ca9" containerName="registry" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.152413 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f4875b-2470-43e2-aa8a-ae48871e0ca9" containerName="registry" Mar 19 20:14:00 crc kubenswrapper[4799]: E0319 20:14:00.152438 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c55479-c3ec-4e72-b12d-d287a8c82f42" containerName="oc" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.152450 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c55479-c3ec-4e72-b12d-d287a8c82f42" containerName="oc" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.152645 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f4875b-2470-43e2-aa8a-ae48871e0ca9" containerName="registry" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.152669 4799 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="79c55479-c3ec-4e72-b12d-d287a8c82f42" containerName="oc" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.153220 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.157853 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.158786 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.158983 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.165337 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-fjk2f"] Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.316450 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987tr\" (UniqueName: \"kubernetes.io/projected/3482d963-d9c6-41f9-b382-486e75051602-kube-api-access-987tr\") pod \"auto-csr-approver-29565854-fjk2f\" (UID: \"3482d963-d9c6-41f9-b382-486e75051602\") " pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.417866 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-987tr\" (UniqueName: \"kubernetes.io/projected/3482d963-d9c6-41f9-b382-486e75051602-kube-api-access-987tr\") pod \"auto-csr-approver-29565854-fjk2f\" (UID: \"3482d963-d9c6-41f9-b382-486e75051602\") " pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.450884 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-987tr\" (UniqueName: 
\"kubernetes.io/projected/3482d963-d9c6-41f9-b382-486e75051602-kube-api-access-987tr\") pod \"auto-csr-approver-29565854-fjk2f\" (UID: \"3482d963-d9c6-41f9-b382-486e75051602\") " pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.483136 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.785148 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-fjk2f"] Mar 19 20:14:00 crc kubenswrapper[4799]: I0319 20:14:00.792237 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:14:01 crc kubenswrapper[4799]: I0319 20:14:01.217111 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" event={"ID":"3482d963-d9c6-41f9-b382-486e75051602","Type":"ContainerStarted","Data":"b6ef8bd4b79f70b69e9f1bfb2a33c85e75f752507cba362391dd771d1921712c"} Mar 19 20:14:02 crc kubenswrapper[4799]: I0319 20:14:02.225736 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" event={"ID":"3482d963-d9c6-41f9-b382-486e75051602","Type":"ContainerStarted","Data":"027b544f1ac7ca66542ecf1d6f504a7e07577eb5791872fd86c10c27bc623cd1"} Mar 19 20:14:02 crc kubenswrapper[4799]: I0319 20:14:02.246508 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" podStartSLOduration=1.250496376 podStartE2EDuration="2.246474199s" podCreationTimestamp="2026-03-19 20:14:00 +0000 UTC" firstStartedPulling="2026-03-19 20:14:00.791678413 +0000 UTC m=+518.397631535" lastFinishedPulling="2026-03-19 20:14:01.787656276 +0000 UTC m=+519.393609358" observedRunningTime="2026-03-19 20:14:02.244299673 +0000 UTC m=+519.850252785" 
watchObservedRunningTime="2026-03-19 20:14:02.246474199 +0000 UTC m=+519.852427311" Mar 19 20:14:03 crc kubenswrapper[4799]: I0319 20:14:03.237106 4799 generic.go:334] "Generic (PLEG): container finished" podID="3482d963-d9c6-41f9-b382-486e75051602" containerID="027b544f1ac7ca66542ecf1d6f504a7e07577eb5791872fd86c10c27bc623cd1" exitCode=0 Mar 19 20:14:03 crc kubenswrapper[4799]: I0319 20:14:03.237193 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" event={"ID":"3482d963-d9c6-41f9-b382-486e75051602","Type":"ContainerDied","Data":"027b544f1ac7ca66542ecf1d6f504a7e07577eb5791872fd86c10c27bc623cd1"} Mar 19 20:14:04 crc kubenswrapper[4799]: I0319 20:14:04.569600 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:04 crc kubenswrapper[4799]: I0319 20:14:04.676602 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-987tr\" (UniqueName: \"kubernetes.io/projected/3482d963-d9c6-41f9-b382-486e75051602-kube-api-access-987tr\") pod \"3482d963-d9c6-41f9-b382-486e75051602\" (UID: \"3482d963-d9c6-41f9-b382-486e75051602\") " Mar 19 20:14:04 crc kubenswrapper[4799]: I0319 20:14:04.686824 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3482d963-d9c6-41f9-b382-486e75051602-kube-api-access-987tr" (OuterVolumeSpecName: "kube-api-access-987tr") pod "3482d963-d9c6-41f9-b382-486e75051602" (UID: "3482d963-d9c6-41f9-b382-486e75051602"). InnerVolumeSpecName "kube-api-access-987tr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:14:04 crc kubenswrapper[4799]: I0319 20:14:04.778006 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-987tr\" (UniqueName: \"kubernetes.io/projected/3482d963-d9c6-41f9-b382-486e75051602-kube-api-access-987tr\") on node \"crc\" DevicePath \"\"" Mar 19 20:14:05 crc kubenswrapper[4799]: I0319 20:14:05.253226 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" event={"ID":"3482d963-d9c6-41f9-b382-486e75051602","Type":"ContainerDied","Data":"b6ef8bd4b79f70b69e9f1bfb2a33c85e75f752507cba362391dd771d1921712c"} Mar 19 20:14:05 crc kubenswrapper[4799]: I0319 20:14:05.253302 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6ef8bd4b79f70b69e9f1bfb2a33c85e75f752507cba362391dd771d1921712c" Mar 19 20:14:05 crc kubenswrapper[4799]: I0319 20:14:05.253446 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-fjk2f" Mar 19 20:14:05 crc kubenswrapper[4799]: I0319 20:14:05.332241 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-5gcs8"] Mar 19 20:14:05 crc kubenswrapper[4799]: I0319 20:14:05.339042 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-5gcs8"] Mar 19 20:14:07 crc kubenswrapper[4799]: I0319 20:14:07.128654 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31aa7077-55c5-426b-a92f-c93b8d767105" path="/var/lib/kubelet/pods/31aa7077-55c5-426b-a92f-c93b8d767105/volumes" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.152832 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk"] Mar 19 20:15:00 crc kubenswrapper[4799]: E0319 20:15:00.154085 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3482d963-d9c6-41f9-b382-486e75051602" containerName="oc" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.154110 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3482d963-d9c6-41f9-b382-486e75051602" containerName="oc" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.154310 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3482d963-d9c6-41f9-b382-486e75051602" containerName="oc" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.154981 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.157940 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.158102 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.169480 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk"] Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.294446 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6313da4-6572-481f-888e-db433419606a-config-volume\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.294534 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcmv\" (UniqueName: \"kubernetes.io/projected/c6313da4-6572-481f-888e-db433419606a-kube-api-access-jdcmv\") pod 
\"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.294606 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6313da4-6572-481f-888e-db433419606a-secret-volume\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.396532 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6313da4-6572-481f-888e-db433419606a-config-volume\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.396632 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcmv\" (UniqueName: \"kubernetes.io/projected/c6313da4-6572-481f-888e-db433419606a-kube-api-access-jdcmv\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.396704 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6313da4-6572-481f-888e-db433419606a-secret-volume\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.397679 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6313da4-6572-481f-888e-db433419606a-config-volume\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.406005 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6313da4-6572-481f-888e-db433419606a-secret-volume\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.428576 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcmv\" (UniqueName: \"kubernetes.io/projected/c6313da4-6572-481f-888e-db433419606a-kube-api-access-jdcmv\") pod \"collect-profiles-29565855-dlpsk\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.494175 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:00 crc kubenswrapper[4799]: I0319 20:15:00.732114 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk"] Mar 19 20:15:00 crc kubenswrapper[4799]: W0319 20:15:00.750407 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6313da4_6572_481f_888e_db433419606a.slice/crio-b4682389a120d6c099a2e5bee5b8de54918159cd586794501da69e98a45cab75 WatchSource:0}: Error finding container b4682389a120d6c099a2e5bee5b8de54918159cd586794501da69e98a45cab75: Status 404 returned error can't find the container with id b4682389a120d6c099a2e5bee5b8de54918159cd586794501da69e98a45cab75 Mar 19 20:15:01 crc kubenswrapper[4799]: I0319 20:15:01.670586 4799 generic.go:334] "Generic (PLEG): container finished" podID="c6313da4-6572-481f-888e-db433419606a" containerID="6bb1ff36d0a799238ce792173817baca9762c92e6379ac10e817f97dcea79059" exitCode=0 Mar 19 20:15:01 crc kubenswrapper[4799]: I0319 20:15:01.670686 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" event={"ID":"c6313da4-6572-481f-888e-db433419606a","Type":"ContainerDied","Data":"6bb1ff36d0a799238ce792173817baca9762c92e6379ac10e817f97dcea79059"} Mar 19 20:15:01 crc kubenswrapper[4799]: I0319 20:15:01.670983 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" event={"ID":"c6313da4-6572-481f-888e-db433419606a","Type":"ContainerStarted","Data":"b4682389a120d6c099a2e5bee5b8de54918159cd586794501da69e98a45cab75"} Mar 19 20:15:02 crc kubenswrapper[4799]: I0319 20:15:02.948850 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.139783 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdcmv\" (UniqueName: \"kubernetes.io/projected/c6313da4-6572-481f-888e-db433419606a-kube-api-access-jdcmv\") pod \"c6313da4-6572-481f-888e-db433419606a\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.139921 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6313da4-6572-481f-888e-db433419606a-config-volume\") pod \"c6313da4-6572-481f-888e-db433419606a\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.139982 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6313da4-6572-481f-888e-db433419606a-secret-volume\") pod \"c6313da4-6572-481f-888e-db433419606a\" (UID: \"c6313da4-6572-481f-888e-db433419606a\") " Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.140828 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6313da4-6572-481f-888e-db433419606a-config-volume" (OuterVolumeSpecName: "config-volume") pod "c6313da4-6572-481f-888e-db433419606a" (UID: "c6313da4-6572-481f-888e-db433419606a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.149624 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6313da4-6572-481f-888e-db433419606a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c6313da4-6572-481f-888e-db433419606a" (UID: "c6313da4-6572-481f-888e-db433419606a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.149705 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6313da4-6572-481f-888e-db433419606a-kube-api-access-jdcmv" (OuterVolumeSpecName: "kube-api-access-jdcmv") pod "c6313da4-6572-481f-888e-db433419606a" (UID: "c6313da4-6572-481f-888e-db433419606a"). InnerVolumeSpecName "kube-api-access-jdcmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.242122 4799 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6313da4-6572-481f-888e-db433419606a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.242174 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdcmv\" (UniqueName: \"kubernetes.io/projected/c6313da4-6572-481f-888e-db433419606a-kube-api-access-jdcmv\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.242195 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6313da4-6572-481f-888e-db433419606a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.687322 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" event={"ID":"c6313da4-6572-481f-888e-db433419606a","Type":"ContainerDied","Data":"b4682389a120d6c099a2e5bee5b8de54918159cd586794501da69e98a45cab75"} Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.687777 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4682389a120d6c099a2e5bee5b8de54918159cd586794501da69e98a45cab75" Mar 19 20:15:03 crc kubenswrapper[4799]: I0319 20:15:03.687457 4799 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk" Mar 19 20:15:37 crc kubenswrapper[4799]: I0319 20:15:37.752607 4799 scope.go:117] "RemoveContainer" containerID="f2c69ecf363bed623988c3cdc0da8c035cc1467783192da8b53610c7e2ffdc3f" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.154099 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565856-5p4qg"] Mar 19 20:16:00 crc kubenswrapper[4799]: E0319 20:16:00.155423 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6313da4-6572-481f-888e-db433419606a" containerName="collect-profiles" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.155449 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6313da4-6572-481f-888e-db433419606a" containerName="collect-profiles" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.155642 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6313da4-6572-481f-888e-db433419606a" containerName="collect-profiles" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.158194 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.162951 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.162970 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.162993 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.166288 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-5p4qg"] Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.175324 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjp8\" (UniqueName: \"kubernetes.io/projected/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0-kube-api-access-xnjp8\") pod \"auto-csr-approver-29565856-5p4qg\" (UID: \"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0\") " pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.276461 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjp8\" (UniqueName: \"kubernetes.io/projected/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0-kube-api-access-xnjp8\") pod \"auto-csr-approver-29565856-5p4qg\" (UID: \"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0\") " pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.298752 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjp8\" (UniqueName: \"kubernetes.io/projected/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0-kube-api-access-xnjp8\") pod \"auto-csr-approver-29565856-5p4qg\" (UID: \"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0\") " 
pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.504643 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:00 crc kubenswrapper[4799]: I0319 20:16:00.954833 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-5p4qg"] Mar 19 20:16:01 crc kubenswrapper[4799]: I0319 20:16:01.128557 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" event={"ID":"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0","Type":"ContainerStarted","Data":"bf4a0ae29a80b07d1665e24c12961b0451b50ba8c6eddabfd30b8df00e02987f"} Mar 19 20:16:03 crc kubenswrapper[4799]: I0319 20:16:03.133223 4799 generic.go:334] "Generic (PLEG): container finished" podID="cc9f86d5-d034-4353-bdb0-67c42ee7d2e0" containerID="996eb7018a44b89623da1ef36c83afec8f759bfe5dc6310c886ef2b0962ebe46" exitCode=0 Mar 19 20:16:03 crc kubenswrapper[4799]: I0319 20:16:03.133328 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" event={"ID":"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0","Type":"ContainerDied","Data":"996eb7018a44b89623da1ef36c83afec8f759bfe5dc6310c886ef2b0962ebe46"} Mar 19 20:16:04 crc kubenswrapper[4799]: I0319 20:16:04.483667 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:04 crc kubenswrapper[4799]: I0319 20:16:04.632080 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnjp8\" (UniqueName: \"kubernetes.io/projected/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0-kube-api-access-xnjp8\") pod \"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0\" (UID: \"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0\") " Mar 19 20:16:04 crc kubenswrapper[4799]: I0319 20:16:04.638517 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0-kube-api-access-xnjp8" (OuterVolumeSpecName: "kube-api-access-xnjp8") pod "cc9f86d5-d034-4353-bdb0-67c42ee7d2e0" (UID: "cc9f86d5-d034-4353-bdb0-67c42ee7d2e0"). InnerVolumeSpecName "kube-api-access-xnjp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:16:04 crc kubenswrapper[4799]: I0319 20:16:04.733603 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnjp8\" (UniqueName: \"kubernetes.io/projected/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0-kube-api-access-xnjp8\") on node \"crc\" DevicePath \"\"" Mar 19 20:16:05 crc kubenswrapper[4799]: I0319 20:16:05.154995 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" event={"ID":"cc9f86d5-d034-4353-bdb0-67c42ee7d2e0","Type":"ContainerDied","Data":"bf4a0ae29a80b07d1665e24c12961b0451b50ba8c6eddabfd30b8df00e02987f"} Mar 19 20:16:05 crc kubenswrapper[4799]: I0319 20:16:05.155742 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4a0ae29a80b07d1665e24c12961b0451b50ba8c6eddabfd30b8df00e02987f" Mar 19 20:16:05 crc kubenswrapper[4799]: I0319 20:16:05.155347 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-5p4qg" Mar 19 20:16:05 crc kubenswrapper[4799]: I0319 20:16:05.572280 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-8c5pj"] Mar 19 20:16:05 crc kubenswrapper[4799]: I0319 20:16:05.578847 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-8c5pj"] Mar 19 20:16:07 crc kubenswrapper[4799]: I0319 20:16:07.126631 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd47b02a-a448-4b51-bec1-977f2ebbc4e2" path="/var/lib/kubelet/pods/cd47b02a-a448-4b51-bec1-977f2ebbc4e2/volumes" Mar 19 20:16:28 crc kubenswrapper[4799]: I0319 20:16:28.756170 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:16:28 crc kubenswrapper[4799]: I0319 20:16:28.756652 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:16:37 crc kubenswrapper[4799]: I0319 20:16:37.819122 4799 scope.go:117] "RemoveContainer" containerID="22b59f0344f3d7f5047b20c8a7d5f8d6917a3385f514e25a0070a94f8f932194" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.214765 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb"] Mar 19 20:16:55 crc kubenswrapper[4799]: E0319 20:16:55.215716 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9f86d5-d034-4353-bdb0-67c42ee7d2e0" containerName="oc" Mar 19 20:16:55 crc 
kubenswrapper[4799]: I0319 20:16:55.215737 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9f86d5-d034-4353-bdb0-67c42ee7d2e0" containerName="oc" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.215918 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9f86d5-d034-4353-bdb0-67c42ee7d2e0" containerName="oc" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.216517 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.225791 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.226333 4799 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qnfsc" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.229034 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.237783 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb"] Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.243613 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jtk48"] Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.244649 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.247066 4799 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pppns" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.276845 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fv54b"] Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.278699 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fv54b" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.281620 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jtk48"] Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.281772 4799 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-kfpm6" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.285708 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fv54b"] Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.309985 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvrtp\" (UniqueName: \"kubernetes.io/projected/5f8bca57-6368-4bd6-9d79-d0e640dd074f-kube-api-access-lvrtp\") pod \"cert-manager-858654f9db-fv54b\" (UID: \"5f8bca57-6368-4bd6-9d79-d0e640dd074f\") " pod="cert-manager/cert-manager-858654f9db-fv54b" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.310057 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgv55\" (UniqueName: \"kubernetes.io/projected/570da1bb-c2ff-40e5-a2b6-352d09168d6d-kube-api-access-jgv55\") pod \"cert-manager-webhook-687f57d79b-jtk48\" (UID: \"570da1bb-c2ff-40e5-a2b6-352d09168d6d\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.310098 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hnx2\" (UniqueName: \"kubernetes.io/projected/a471aa18-d5fa-455b-b8a6-395717db50b9-kube-api-access-4hnx2\") pod \"cert-manager-cainjector-cf98fcc89-9ppqb\" (UID: \"a471aa18-d5fa-455b-b8a6-395717db50b9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.411430 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvrtp\" (UniqueName: \"kubernetes.io/projected/5f8bca57-6368-4bd6-9d79-d0e640dd074f-kube-api-access-lvrtp\") pod \"cert-manager-858654f9db-fv54b\" (UID: \"5f8bca57-6368-4bd6-9d79-d0e640dd074f\") " pod="cert-manager/cert-manager-858654f9db-fv54b" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.411513 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgv55\" (UniqueName: \"kubernetes.io/projected/570da1bb-c2ff-40e5-a2b6-352d09168d6d-kube-api-access-jgv55\") pod \"cert-manager-webhook-687f57d79b-jtk48\" (UID: \"570da1bb-c2ff-40e5-a2b6-352d09168d6d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.411541 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hnx2\" (UniqueName: \"kubernetes.io/projected/a471aa18-d5fa-455b-b8a6-395717db50b9-kube-api-access-4hnx2\") pod \"cert-manager-cainjector-cf98fcc89-9ppqb\" (UID: \"a471aa18-d5fa-455b-b8a6-395717db50b9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.430886 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgv55\" (UniqueName: 
\"kubernetes.io/projected/570da1bb-c2ff-40e5-a2b6-352d09168d6d-kube-api-access-jgv55\") pod \"cert-manager-webhook-687f57d79b-jtk48\" (UID: \"570da1bb-c2ff-40e5-a2b6-352d09168d6d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.430950 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvrtp\" (UniqueName: \"kubernetes.io/projected/5f8bca57-6368-4bd6-9d79-d0e640dd074f-kube-api-access-lvrtp\") pod \"cert-manager-858654f9db-fv54b\" (UID: \"5f8bca57-6368-4bd6-9d79-d0e640dd074f\") " pod="cert-manager/cert-manager-858654f9db-fv54b" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.430997 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hnx2\" (UniqueName: \"kubernetes.io/projected/a471aa18-d5fa-455b-b8a6-395717db50b9-kube-api-access-4hnx2\") pod \"cert-manager-cainjector-cf98fcc89-9ppqb\" (UID: \"a471aa18-d5fa-455b-b8a6-395717db50b9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.537823 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.563811 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.592776 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fv54b" Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.848412 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fv54b"] Mar 19 20:16:55 crc kubenswrapper[4799]: I0319 20:16:55.890636 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jtk48"] Mar 19 20:16:55 crc kubenswrapper[4799]: W0319 20:16:55.891510 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570da1bb_c2ff_40e5_a2b6_352d09168d6d.slice/crio-cdca9a63860f4149b8aa9828bce049a1588bae07b2d0968c8d54db2f8ad637fd WatchSource:0}: Error finding container cdca9a63860f4149b8aa9828bce049a1588bae07b2d0968c8d54db2f8ad637fd: Status 404 returned error can't find the container with id cdca9a63860f4149b8aa9828bce049a1588bae07b2d0968c8d54db2f8ad637fd Mar 19 20:16:56 crc kubenswrapper[4799]: I0319 20:16:56.032785 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb"] Mar 19 20:16:56 crc kubenswrapper[4799]: W0319 20:16:56.038750 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda471aa18_d5fa_455b_b8a6_395717db50b9.slice/crio-bfc83df1e4bad3f30a651cdb9cdd53b4f21dcbb44fd5056bbed365d5bf75ae02 WatchSource:0}: Error finding container bfc83df1e4bad3f30a651cdb9cdd53b4f21dcbb44fd5056bbed365d5bf75ae02: Status 404 returned error can't find the container with id bfc83df1e4bad3f30a651cdb9cdd53b4f21dcbb44fd5056bbed365d5bf75ae02 Mar 19 20:16:56 crc kubenswrapper[4799]: I0319 20:16:56.531635 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" 
event={"ID":"570da1bb-c2ff-40e5-a2b6-352d09168d6d","Type":"ContainerStarted","Data":"cdca9a63860f4149b8aa9828bce049a1588bae07b2d0968c8d54db2f8ad637fd"} Mar 19 20:16:56 crc kubenswrapper[4799]: I0319 20:16:56.533052 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fv54b" event={"ID":"5f8bca57-6368-4bd6-9d79-d0e640dd074f","Type":"ContainerStarted","Data":"37c9858ae7b5bbc3cf0fecc8ff125666751d63a16e187245af259b4a917ee9a7"} Mar 19 20:16:56 crc kubenswrapper[4799]: I0319 20:16:56.534618 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" event={"ID":"a471aa18-d5fa-455b-b8a6-395717db50b9","Type":"ContainerStarted","Data":"bfc83df1e4bad3f30a651cdb9cdd53b4f21dcbb44fd5056bbed365d5bf75ae02"} Mar 19 20:16:58 crc kubenswrapper[4799]: I0319 20:16:58.755695 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:16:58 crc kubenswrapper[4799]: I0319 20:16:58.756185 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.564035 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fv54b" event={"ID":"5f8bca57-6368-4bd6-9d79-d0e640dd074f","Type":"ContainerStarted","Data":"6fb43b76518e9b7e1dd4ca66827d9c04bba658c1046643b725bf35f16e2c7a7f"} Mar 19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.566222 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" event={"ID":"a471aa18-d5fa-455b-b8a6-395717db50b9","Type":"ContainerStarted","Data":"61fe738cf0a18d584ba77f08e5aa09e10ea3a4d7fb077c7b72082343d66a5228"} Mar 19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.568026 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" event={"ID":"570da1bb-c2ff-40e5-a2b6-352d09168d6d","Type":"ContainerStarted","Data":"ff84624f4f28b53cc0fd4ca716f80ac55b66bf203acd04419bf25b4733d22b6e"} Mar 19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.568234 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.588008 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fv54b" podStartSLOduration=1.9696214730000001 podStartE2EDuration="5.587981202s" podCreationTimestamp="2026-03-19 20:16:55 +0000 UTC" firstStartedPulling="2026-03-19 20:16:55.857164529 +0000 UTC m=+693.463117601" lastFinishedPulling="2026-03-19 20:16:59.475524248 +0000 UTC m=+697.081477330" observedRunningTime="2026-03-19 20:17:00.584882668 +0000 UTC m=+698.190835780" watchObservedRunningTime="2026-03-19 20:17:00.587981202 +0000 UTC m=+698.193934314" Mar 19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.609431 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" podStartSLOduration=2.026068571 podStartE2EDuration="5.609376675s" podCreationTimestamp="2026-03-19 20:16:55 +0000 UTC" firstStartedPulling="2026-03-19 20:16:55.893323667 +0000 UTC m=+693.499276739" lastFinishedPulling="2026-03-19 20:16:59.476631731 +0000 UTC m=+697.082584843" observedRunningTime="2026-03-19 20:17:00.607350544 +0000 UTC m=+698.213303656" watchObservedRunningTime="2026-03-19 20:17:00.609376675 +0000 UTC m=+698.215329777" Mar 
19 20:17:00 crc kubenswrapper[4799]: I0319 20:17:00.634360 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9ppqb" podStartSLOduration=2.139614478 podStartE2EDuration="5.634341647s" podCreationTimestamp="2026-03-19 20:16:55 +0000 UTC" firstStartedPulling="2026-03-19 20:16:56.040705492 +0000 UTC m=+693.646658554" lastFinishedPulling="2026-03-19 20:16:59.535432641 +0000 UTC m=+697.141385723" observedRunningTime="2026-03-19 20:17:00.633149781 +0000 UTC m=+698.239102873" watchObservedRunningTime="2026-03-19 20:17:00.634341647 +0000 UTC m=+698.240294719" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.476967 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b2bc2"] Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.478506 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-controller" containerID="cri-o://082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.478593 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="nbdb" containerID="cri-o://728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.480264 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.480364 4799 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="sbdb" containerID="cri-o://42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.480485 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-node" containerID="cri-o://8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.480522 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="northd" containerID="cri-o://c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.480552 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-acl-logging" containerID="cri-o://b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.513831 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovnkube-controller" containerID="cri-o://691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" gracePeriod=30 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.567363 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-jtk48" Mar 19 20:17:05 crc kubenswrapper[4799]: 
I0319 20:17:05.610355 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hgdvf_375732b9-7d32-4090-b9d0-f6168107436b/kube-multus/0.log" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.610421 4799 generic.go:334] "Generic (PLEG): container finished" podID="375732b9-7d32-4090-b9d0-f6168107436b" containerID="4e9b2a9be48f6d2a71959464f4aa0703755e22d46cc402516abccafb85ef1d93" exitCode=2 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.610484 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgdvf" event={"ID":"375732b9-7d32-4090-b9d0-f6168107436b","Type":"ContainerDied","Data":"4e9b2a9be48f6d2a71959464f4aa0703755e22d46cc402516abccafb85ef1d93"} Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.611649 4799 scope.go:117] "RemoveContainer" containerID="4e9b2a9be48f6d2a71959464f4aa0703755e22d46cc402516abccafb85ef1d93" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.624799 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2bc2_3c4f2665-70de-4a4f-85d2-c93b098c910a/ovn-acl-logging/0.log" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.625874 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2bc2_3c4f2665-70de-4a4f-85d2-c93b098c910a/ovn-controller/0.log" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.626342 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" exitCode=0 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.626375 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" exitCode=143 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.626408 4799 generic.go:334] "Generic (PLEG): container 
finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" exitCode=143 Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.626437 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.626499 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.626521 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.790472 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2bc2_3c4f2665-70de-4a4f-85d2-c93b098c910a/ovn-acl-logging/0.log" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.791739 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2bc2_3c4f2665-70de-4a4f-85d2-c93b098c910a/ovn-controller/0.log" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.792556 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.861301 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fgkjs"] Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.861907 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862028 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862082 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-controller" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862096 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-controller" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862121 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovnkube-controller" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862134 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovnkube-controller" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862156 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-acl-logging" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862167 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-acl-logging" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862179 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-node" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862192 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-node" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862217 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kubecfg-setup" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862227 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kubecfg-setup" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862248 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="sbdb" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862259 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="sbdb" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862291 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="northd" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862304 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="northd" Mar 19 20:17:05 crc kubenswrapper[4799]: E0319 20:17:05.862318 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="nbdb" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862328 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="nbdb" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862690 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="northd" Mar 
19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862727 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovnkube-controller" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862742 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="nbdb" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862764 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="sbdb" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862780 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862801 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-controller" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862815 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="kube-rbac-proxy-node" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.862837 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerName="ovn-acl-logging" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.867802 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970789 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-kubelet\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970848 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-systemd-units\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970871 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-systemd\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970906 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovn-node-metrics-cert\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970931 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-node-log\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970959 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bzc9c\" (UniqueName: \"kubernetes.io/projected/3c4f2665-70de-4a4f-85d2-c93b098c910a-kube-api-access-bzc9c\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.970947 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971008 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-ovn\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971141 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-config\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971196 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-node-log" (OuterVolumeSpecName: "node-log") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971231 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-env-overrides\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971189 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971557 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971674 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-openvswitch\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971743 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-slash\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971771 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-bin\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971801 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-netns\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971788 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971851 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971845 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971825 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971827 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971819 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-slash" (OuterVolumeSpecName: "host-slash") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971888 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971908 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-log-socket\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971935 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-var-lib-openvswitch\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971937 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-log-socket" (OuterVolumeSpecName: "log-socket") pod 
"3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971967 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-script-lib\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971986 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-netd\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972002 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-ovn-kubernetes\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971972 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.971991 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972047 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972027 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-etc-openvswitch\") pod \"3c4f2665-70de-4a4f-85d2-c93b098c910a\" (UID: \"3c4f2665-70de-4a4f-85d2-c93b098c910a\") " Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972017 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972044 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972283 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovnkube-script-lib\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972325 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-run-netns\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972359 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-systemd-units\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972519 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-env-overrides\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972654 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-var-lib-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972689 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-cni-bin\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972694 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972783 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovn-node-metrics-cert\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972820 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-systemd\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972850 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovnkube-config\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972883 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972916 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-slash\") pod 
\"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972954 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp68d\" (UniqueName: \"kubernetes.io/projected/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-kube-api-access-vp68d\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.972983 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-node-log\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973011 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-etc-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973042 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973072 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-kubelet\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973103 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-cni-netd\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973140 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973174 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-log-socket\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973206 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-ovn\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973286 4799 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973306 4799 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973324 4799 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973339 4799 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973354 4799 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973368 4799 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973407 4799 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973425 4799 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc 
kubenswrapper[4799]: I0319 20:17:05.973440 4799 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973568 4799 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973607 4799 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973626 4799 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973645 4799 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973666 4799 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973686 4799 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973723 4799 reconciler_common.go:293] "Volume detached 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.973749 4799 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.979413 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.981126 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4f2665-70de-4a4f-85d2-c93b098c910a-kube-api-access-bzc9c" (OuterVolumeSpecName: "kube-api-access-bzc9c") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "kube-api-access-bzc9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:17:05 crc kubenswrapper[4799]: I0319 20:17:05.993559 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3c4f2665-70de-4a4f-85d2-c93b098c910a" (UID: "3c4f2665-70de-4a4f-85d2-c93b098c910a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.074866 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovn-node-metrics-cert\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.074965 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-systemd\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075018 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovnkube-config\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075073 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075083 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-systemd\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075127 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-slash\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075196 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp68d\" (UniqueName: \"kubernetes.io/projected/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-kube-api-access-vp68d\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075207 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075241 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-node-log\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075321 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-etc-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075349 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-slash\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075421 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075363 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075468 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-etc-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075321 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-node-log\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: 
I0319 20:17:06.075511 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-kubelet\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075544 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-kubelet\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075579 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-cni-netd\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075624 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075662 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-log-socket\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075684 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075702 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-ovn\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075706 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-cni-netd\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075738 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-log-socket\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075740 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovnkube-script-lib\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075787 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-run-netns\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075820 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-systemd-units\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075823 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-run-netns\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075860 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-systemd-units\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075787 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-run-ovn\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.075882 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-env-overrides\") pod \"ovnkube-node-fgkjs\" (UID: 
\"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076214 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-var-lib-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076254 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-cni-bin\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076357 4799 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3c4f2665-70de-4a4f-85d2-c93b098c910a-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076372 4799 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3c4f2665-70de-4a4f-85d2-c93b098c910a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076434 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzc9c\" (UniqueName: \"kubernetes.io/projected/3c4f2665-70de-4a4f-85d2-c93b098c910a-kube-api-access-bzc9c\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076444 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-var-lib-openvswitch\") pod \"ovnkube-node-fgkjs\" (UID: 
\"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076340 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovnkube-config\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076477 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-host-cni-bin\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.076950 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-env-overrides\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.079935 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovnkube-script-lib\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.081333 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-ovn-node-metrics-cert\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 
20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.104737 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp68d\" (UniqueName: \"kubernetes.io/projected/2ff929ef-9fb5-49a5-a8e3-7820290a9b1f-kube-api-access-vp68d\") pod \"ovnkube-node-fgkjs\" (UID: \"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.192140 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:06 crc kubenswrapper[4799]: W0319 20:17:06.223869 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ff929ef_9fb5_49a5_a8e3_7820290a9b1f.slice/crio-0846fb9adfc1da8d1a0ebc7253ff73206073c3bc65d311de732624b7f0a7c330 WatchSource:0}: Error finding container 0846fb9adfc1da8d1a0ebc7253ff73206073c3bc65d311de732624b7f0a7c330: Status 404 returned error can't find the container with id 0846fb9adfc1da8d1a0ebc7253ff73206073c3bc65d311de732624b7f0a7c330 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.636084 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ff929ef-9fb5-49a5-a8e3-7820290a9b1f" containerID="7238aef8a6853a6e78a6c706999798a216789cc156ec98d223ae4bf2370cec78" exitCode=0 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.636204 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerDied","Data":"7238aef8a6853a6e78a6c706999798a216789cc156ec98d223ae4bf2370cec78"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.636270 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" 
event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"0846fb9adfc1da8d1a0ebc7253ff73206073c3bc65d311de732624b7f0a7c330"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.640149 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hgdvf_375732b9-7d32-4090-b9d0-f6168107436b/kube-multus/0.log" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.640281 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgdvf" event={"ID":"375732b9-7d32-4090-b9d0-f6168107436b","Type":"ContainerStarted","Data":"bc863dc6b406cef7493c334bbcfd72828f4fec1c2b24ad77311822c96a739d54"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.651500 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2bc2_3c4f2665-70de-4a4f-85d2-c93b098c910a/ovn-acl-logging/0.log" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.652460 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2bc2_3c4f2665-70de-4a4f-85d2-c93b098c910a/ovn-controller/0.log" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653225 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" exitCode=0 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653264 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" exitCode=0 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653279 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" exitCode=0 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653292 4799 generic.go:334] "Generic 
(PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" exitCode=0 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653306 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c4f2665-70de-4a4f-85d2-c93b098c910a" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" exitCode=0 Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653786 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653860 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653900 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653928 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653953 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" 
event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.653979 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" event={"ID":"3c4f2665-70de-4a4f-85d2-c93b098c910a","Type":"ContainerDied","Data":"c24142686369c05044686930e821e2347fba48e9764de3de0dd6fd9935f1bc64"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.654005 4799 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.654028 4799 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.654042 4799 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34"} Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.654070 4799 scope.go:117] "RemoveContainer" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.654433 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b2bc2" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.687995 4799 scope.go:117] "RemoveContainer" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.713599 4799 scope.go:117] "RemoveContainer" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.750297 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b2bc2"] Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.755475 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b2bc2"] Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.762334 4799 scope.go:117] "RemoveContainer" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.797737 4799 scope.go:117] "RemoveContainer" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.814173 4799 scope.go:117] "RemoveContainer" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.828234 4799 scope.go:117] "RemoveContainer" containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.862216 4799 scope.go:117] "RemoveContainer" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.914011 4799 scope.go:117] "RemoveContainer" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.931144 4799 scope.go:117] "RemoveContainer" 
containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.931727 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": container with ID starting with 691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635 not found: ID does not exist" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.931873 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} err="failed to get container status \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": rpc error: code = NotFound desc = could not find container \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": container with ID starting with 691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.931991 4799 scope.go:117] "RemoveContainer" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.932555 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": container with ID starting with 42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6 not found: ID does not exist" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.932665 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} err="failed to get container status \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": rpc error: code = NotFound desc = could not find container \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": container with ID starting with 42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.932748 4799 scope.go:117] "RemoveContainer" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.933206 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": container with ID starting with 728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082 not found: ID does not exist" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.933247 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} err="failed to get container status \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": rpc error: code = NotFound desc = could not find container \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": container with ID starting with 728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.933279 4799 scope.go:117] "RemoveContainer" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.933848 4799 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": container with ID starting with c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b not found: ID does not exist" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.933968 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} err="failed to get container status \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": rpc error: code = NotFound desc = could not find container \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": container with ID starting with c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.934074 4799 scope.go:117] "RemoveContainer" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.934549 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": container with ID starting with d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e not found: ID does not exist" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.934586 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} err="failed to get container status \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": rpc error: code = NotFound desc = could not find container 
\"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": container with ID starting with d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.934606 4799 scope.go:117] "RemoveContainer" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.934842 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": container with ID starting with 8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130 not found: ID does not exist" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.934876 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} err="failed to get container status \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": rpc error: code = NotFound desc = could not find container \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": container with ID starting with 8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.934893 4799 scope.go:117] "RemoveContainer" containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.935181 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": container with ID starting with b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032 not found: ID does not exist" 
containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.935284 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} err="failed to get container status \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": rpc error: code = NotFound desc = could not find container \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": container with ID starting with b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.935399 4799 scope.go:117] "RemoveContainer" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.935967 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": container with ID starting with 082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421 not found: ID does not exist" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.936065 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} err="failed to get container status \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": rpc error: code = NotFound desc = could not find container \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": container with ID starting with 082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.936158 4799 scope.go:117] 
"RemoveContainer" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" Mar 19 20:17:06 crc kubenswrapper[4799]: E0319 20:17:06.937472 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": container with ID starting with 1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34 not found: ID does not exist" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.937612 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34"} err="failed to get container status \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": rpc error: code = NotFound desc = could not find container \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": container with ID starting with 1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.937719 4799 scope.go:117] "RemoveContainer" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.938275 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} err="failed to get container status \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": rpc error: code = NotFound desc = could not find container \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": container with ID starting with 691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.938360 4799 
scope.go:117] "RemoveContainer" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.945345 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} err="failed to get container status \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": rpc error: code = NotFound desc = could not find container \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": container with ID starting with 42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.945435 4799 scope.go:117] "RemoveContainer" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.947120 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} err="failed to get container status \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": rpc error: code = NotFound desc = could not find container \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": container with ID starting with 728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.947252 4799 scope.go:117] "RemoveContainer" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.948089 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} err="failed to get container status \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": rpc 
error: code = NotFound desc = could not find container \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": container with ID starting with c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.948160 4799 scope.go:117] "RemoveContainer" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.948658 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} err="failed to get container status \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": rpc error: code = NotFound desc = could not find container \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": container with ID starting with d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.948683 4799 scope.go:117] "RemoveContainer" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.949078 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} err="failed to get container status \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": rpc error: code = NotFound desc = could not find container \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": container with ID starting with 8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.949404 4799 scope.go:117] "RemoveContainer" containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" Mar 19 20:17:06 crc 
kubenswrapper[4799]: I0319 20:17:06.949918 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} err="failed to get container status \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": rpc error: code = NotFound desc = could not find container \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": container with ID starting with b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.949939 4799 scope.go:117] "RemoveContainer" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.950289 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} err="failed to get container status \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": rpc error: code = NotFound desc = could not find container \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": container with ID starting with 082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.950348 4799 scope.go:117] "RemoveContainer" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.950806 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34"} err="failed to get container status \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": rpc error: code = NotFound desc = could not find container \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": container 
with ID starting with 1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.950904 4799 scope.go:117] "RemoveContainer" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.951359 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} err="failed to get container status \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": rpc error: code = NotFound desc = could not find container \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": container with ID starting with 691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.951409 4799 scope.go:117] "RemoveContainer" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.951879 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} err="failed to get container status \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": rpc error: code = NotFound desc = could not find container \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": container with ID starting with 42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.951908 4799 scope.go:117] "RemoveContainer" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.952231 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} err="failed to get container status \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": rpc error: code = NotFound desc = could not find container \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": container with ID starting with 728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.952254 4799 scope.go:117] "RemoveContainer" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.952575 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} err="failed to get container status \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": rpc error: code = NotFound desc = could not find container \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": container with ID starting with c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.952610 4799 scope.go:117] "RemoveContainer" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.952950 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} err="failed to get container status \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": rpc error: code = NotFound desc = could not find container \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": container with ID starting with d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e not found: ID does not 
exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.952992 4799 scope.go:117] "RemoveContainer" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.953448 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} err="failed to get container status \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": rpc error: code = NotFound desc = could not find container \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": container with ID starting with 8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.953474 4799 scope.go:117] "RemoveContainer" containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.953857 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} err="failed to get container status \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": rpc error: code = NotFound desc = could not find container \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": container with ID starting with b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.953890 4799 scope.go:117] "RemoveContainer" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.954262 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} err="failed to get container status 
\"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": rpc error: code = NotFound desc = could not find container \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": container with ID starting with 082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.954317 4799 scope.go:117] "RemoveContainer" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.954670 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34"} err="failed to get container status \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": rpc error: code = NotFound desc = could not find container \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": container with ID starting with 1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.954699 4799 scope.go:117] "RemoveContainer" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.955029 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} err="failed to get container status \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": rpc error: code = NotFound desc = could not find container \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": container with ID starting with 691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.955064 4799 scope.go:117] "RemoveContainer" 
containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.955366 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} err="failed to get container status \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": rpc error: code = NotFound desc = could not find container \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": container with ID starting with 42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.955415 4799 scope.go:117] "RemoveContainer" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.955752 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} err="failed to get container status \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": rpc error: code = NotFound desc = could not find container \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": container with ID starting with 728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.955778 4799 scope.go:117] "RemoveContainer" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.956159 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} err="failed to get container status \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": rpc error: code = NotFound desc = could 
not find container \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": container with ID starting with c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.956178 4799 scope.go:117] "RemoveContainer" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.956433 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} err="failed to get container status \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": rpc error: code = NotFound desc = could not find container \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": container with ID starting with d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.956463 4799 scope.go:117] "RemoveContainer" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.956837 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} err="failed to get container status \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": rpc error: code = NotFound desc = could not find container \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": container with ID starting with 8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.956882 4799 scope.go:117] "RemoveContainer" containerID="b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 
20:17:06.957178 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032"} err="failed to get container status \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": rpc error: code = NotFound desc = could not find container \"b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032\": container with ID starting with b0e826925da67dbea1ddebb2559151dea725b88ebe5e2883c2f64c47c150b032 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.957200 4799 scope.go:117] "RemoveContainer" containerID="082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.957595 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421"} err="failed to get container status \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": rpc error: code = NotFound desc = could not find container \"082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421\": container with ID starting with 082fddcaeb958e06a5456d747ab5c8db167720220207bc8a3d865959a3405421 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.957636 4799 scope.go:117] "RemoveContainer" containerID="1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.957980 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34"} err="failed to get container status \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": rpc error: code = NotFound desc = could not find container \"1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34\": container with ID starting with 
1c5b57f0daf5991d1b6679e1fc5b975a3c2cb6266487a9924644657185aa5c34 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958018 4799 scope.go:117] "RemoveContainer" containerID="691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958310 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635"} err="failed to get container status \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": rpc error: code = NotFound desc = could not find container \"691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635\": container with ID starting with 691f4f559a0f94f52bb160d966056dbb604c456faedd86f18168e2f20f157635 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958353 4799 scope.go:117] "RemoveContainer" containerID="42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958619 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6"} err="failed to get container status \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": rpc error: code = NotFound desc = could not find container \"42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6\": container with ID starting with 42ce8ef814158eaaa4e892bae26b957268ddc740ebb17c0c6b468f0129afeac6 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958657 4799 scope.go:117] "RemoveContainer" containerID="728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958918 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082"} err="failed to get container status \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": rpc error: code = NotFound desc = could not find container \"728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082\": container with ID starting with 728e91166d45b9a9bbc40fa21b3b964f6df5d748fc1704353515a96e3918d082 not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.958952 4799 scope.go:117] "RemoveContainer" containerID="c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.959281 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b"} err="failed to get container status \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": rpc error: code = NotFound desc = could not find container \"c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b\": container with ID starting with c6814103f114b30ff361162b90f522f24910ef82e8b1dc11fa8f92046c1d359b not found: ID does not exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.959312 4799 scope.go:117] "RemoveContainer" containerID="d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.959713 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e"} err="failed to get container status \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": rpc error: code = NotFound desc = could not find container \"d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e\": container with ID starting with d98f4194a7fcdd7624263d5d8d71d2bcd31636b2adcc523dc9ffd44c58ed921e not found: ID does not 
exist" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.959748 4799 scope.go:117] "RemoveContainer" containerID="8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130" Mar 19 20:17:06 crc kubenswrapper[4799]: I0319 20:17:06.960148 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130"} err="failed to get container status \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": rpc error: code = NotFound desc = could not find container \"8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130\": container with ID starting with 8825967ea1736e23b892bc8852513d10b1b68535dc002073c81a6f6830128130 not found: ID does not exist" Mar 19 20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.125215 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4f2665-70de-4a4f-85d2-c93b098c910a" path="/var/lib/kubelet/pods/3c4f2665-70de-4a4f-85d2-c93b098c910a/volumes" Mar 19 20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.665174 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"5daf835441e5a32aacdc2ca9a7aaa22d558dc9d6c9ded117b22357326a5093b7"} Mar 19 20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.665562 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"484e3e492c29247ac738533ce03a7cd98681f2591a1a7ccd81be03d6d612eac2"} Mar 19 20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.665578 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"a0dbddb36c23201bb5344e412639ef3392b8817f259bc4033079e1e0858f3973"} Mar 19 
20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.665592 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"b85839b9992035f866428ae7d5dc8d817e08f1dc8c65f86c05e5a287112ab456"} Mar 19 20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.665602 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"26a0969a89bca3ff7bdb30f2180c55e018a7367135bb5d0f6e6b4bcae34d94a9"} Mar 19 20:17:07 crc kubenswrapper[4799]: I0319 20:17:07.665612 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"c7d7f3b2a272963b7934eee89b68fb40293165f910d0ef1c0b65e307b03ddf60"} Mar 19 20:17:10 crc kubenswrapper[4799]: I0319 20:17:10.692792 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"a687b3e3d4d16956b5e937beefed3453647ebcb90d936f7e9b96bfa8ae6ed7d9"} Mar 19 20:17:12 crc kubenswrapper[4799]: I0319 20:17:12.714638 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" event={"ID":"2ff929ef-9fb5-49a5-a8e3-7820290a9b1f","Type":"ContainerStarted","Data":"b1866ca0cb70e52093e44fd331fa0d2dceed3e7e232f526da29855dc4a807226"} Mar 19 20:17:12 crc kubenswrapper[4799]: I0319 20:17:12.715328 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:12 crc kubenswrapper[4799]: I0319 20:17:12.715421 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:12 crc kubenswrapper[4799]: 
I0319 20:17:12.715451 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:12 crc kubenswrapper[4799]: I0319 20:17:12.766042 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" podStartSLOduration=7.766012927 podStartE2EDuration="7.766012927s" podCreationTimestamp="2026-03-19 20:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:17:12.761960005 +0000 UTC m=+710.367913107" watchObservedRunningTime="2026-03-19 20:17:12.766012927 +0000 UTC m=+710.371966039" Mar 19 20:17:12 crc kubenswrapper[4799]: I0319 20:17:12.813250 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:12 crc kubenswrapper[4799]: I0319 20:17:12.814081 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:28 crc kubenswrapper[4799]: I0319 20:17:28.756271 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:17:28 crc kubenswrapper[4799]: I0319 20:17:28.757181 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:17:28 crc kubenswrapper[4799]: I0319 20:17:28.757252 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:17:28 crc kubenswrapper[4799]: I0319 20:17:28.758420 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74d053865b82ea1997ab8189b4caa9e61f0e0eecb338b66d10379da64224a7e7"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:17:28 crc kubenswrapper[4799]: I0319 20:17:28.758623 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://74d053865b82ea1997ab8189b4caa9e61f0e0eecb338b66d10379da64224a7e7" gracePeriod=600 Mar 19 20:17:29 crc kubenswrapper[4799]: I0319 20:17:29.842148 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="74d053865b82ea1997ab8189b4caa9e61f0e0eecb338b66d10379da64224a7e7" exitCode=0 Mar 19 20:17:29 crc kubenswrapper[4799]: I0319 20:17:29.842222 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"74d053865b82ea1997ab8189b4caa9e61f0e0eecb338b66d10379da64224a7e7"} Mar 19 20:17:29 crc kubenswrapper[4799]: I0319 20:17:29.843123 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"13b6fab9ed6c0d9132855fd64438b4c86e33e17491f418536e545f20790a7c7a"} Mar 19 20:17:29 crc kubenswrapper[4799]: I0319 20:17:29.843175 4799 scope.go:117] "RemoveContainer" 
containerID="7a1a2f02f9c1b5d698b1cb1a5c79bd672a914335d5d58110be6e7d8de1577df2" Mar 19 20:17:36 crc kubenswrapper[4799]: I0319 20:17:36.224581 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fgkjs" Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.822987 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f"] Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.826231 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.828309 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.834701 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f"] Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.928128 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.928190 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpql\" (UniqueName: \"kubernetes.io/projected/527f6060-14f0-48e5-b8a9-4fc91d1775a6-kube-api-access-hxpql\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:43 crc kubenswrapper[4799]: I0319 20:17:43.928238 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.030142 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.030584 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.030696 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpql\" (UniqueName: \"kubernetes.io/projected/527f6060-14f0-48e5-b8a9-4fc91d1775a6-kube-api-access-hxpql\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 
20:17:44.031281 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.031324 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.066949 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpql\" (UniqueName: \"kubernetes.io/projected/527f6060-14f0-48e5-b8a9-4fc91d1775a6-kube-api-access-hxpql\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.147511 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.398372 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f"] Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.961495 4799 generic.go:334] "Generic (PLEG): container finished" podID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerID="63d15cd30d6ea063a9ba75ded15b6690050c27f0f19c6be508b166cec5adddff" exitCode=0 Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.961591 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" event={"ID":"527f6060-14f0-48e5-b8a9-4fc91d1775a6","Type":"ContainerDied","Data":"63d15cd30d6ea063a9ba75ded15b6690050c27f0f19c6be508b166cec5adddff"} Mar 19 20:17:44 crc kubenswrapper[4799]: I0319 20:17:44.961785 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" event={"ID":"527f6060-14f0-48e5-b8a9-4fc91d1775a6","Type":"ContainerStarted","Data":"2655ee2c85e7991360cfae19a1acb41c291586a08b3b48ea379ccfa070f8f659"} Mar 19 20:17:46 crc kubenswrapper[4799]: I0319 20:17:46.979183 4799 generic.go:334] "Generic (PLEG): container finished" podID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerID="c6aa2f32bb334fa9b18d6e0ce64088f63588fa1076534d6cc246c9508958791c" exitCode=0 Mar 19 20:17:46 crc kubenswrapper[4799]: I0319 20:17:46.979292 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" event={"ID":"527f6060-14f0-48e5-b8a9-4fc91d1775a6","Type":"ContainerDied","Data":"c6aa2f32bb334fa9b18d6e0ce64088f63588fa1076534d6cc246c9508958791c"} Mar 19 20:17:47 crc kubenswrapper[4799]: I0319 20:17:47.989096 4799 
generic.go:334] "Generic (PLEG): container finished" podID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerID="453524b266db0ed550f8334d4d16226dd082958605898693d1c6d51957023070" exitCode=0 Mar 19 20:17:47 crc kubenswrapper[4799]: I0319 20:17:47.989140 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" event={"ID":"527f6060-14f0-48e5-b8a9-4fc91d1775a6","Type":"ContainerDied","Data":"453524b266db0ed550f8334d4d16226dd082958605898693d1c6d51957023070"} Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.328459 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.512610 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-bundle\") pod \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.512786 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxpql\" (UniqueName: \"kubernetes.io/projected/527f6060-14f0-48e5-b8a9-4fc91d1775a6-kube-api-access-hxpql\") pod \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.512841 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-util\") pod \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\" (UID: \"527f6060-14f0-48e5-b8a9-4fc91d1775a6\") " Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.514002 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-bundle" (OuterVolumeSpecName: "bundle") pod "527f6060-14f0-48e5-b8a9-4fc91d1775a6" (UID: "527f6060-14f0-48e5-b8a9-4fc91d1775a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.527210 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527f6060-14f0-48e5-b8a9-4fc91d1775a6-kube-api-access-hxpql" (OuterVolumeSpecName: "kube-api-access-hxpql") pod "527f6060-14f0-48e5-b8a9-4fc91d1775a6" (UID: "527f6060-14f0-48e5-b8a9-4fc91d1775a6"). InnerVolumeSpecName "kube-api-access-hxpql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.527684 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-util" (OuterVolumeSpecName: "util") pod "527f6060-14f0-48e5-b8a9-4fc91d1775a6" (UID: "527f6060-14f0-48e5-b8a9-4fc91d1775a6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.615007 4799 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.615057 4799 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/527f6060-14f0-48e5-b8a9-4fc91d1775a6-util\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:49 crc kubenswrapper[4799]: I0319 20:17:49.615067 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxpql\" (UniqueName: \"kubernetes.io/projected/527f6060-14f0-48e5-b8a9-4fc91d1775a6-kube-api-access-hxpql\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:50 crc kubenswrapper[4799]: I0319 20:17:50.015560 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" event={"ID":"527f6060-14f0-48e5-b8a9-4fc91d1775a6","Type":"ContainerDied","Data":"2655ee2c85e7991360cfae19a1acb41c291586a08b3b48ea379ccfa070f8f659"} Mar 19 20:17:50 crc kubenswrapper[4799]: I0319 20:17:50.015620 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2655ee2c85e7991360cfae19a1acb41c291586a08b3b48ea379ccfa070f8f659" Mar 19 20:17:50 crc kubenswrapper[4799]: I0319 20:17:50.015723 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.800054 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-29txn"] Mar 19 20:17:51 crc kubenswrapper[4799]: E0319 20:17:51.800634 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="util" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.800650 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="util" Mar 19 20:17:51 crc kubenswrapper[4799]: E0319 20:17:51.800666 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="extract" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.800674 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="extract" Mar 19 20:17:51 crc kubenswrapper[4799]: E0319 20:17:51.800693 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="pull" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.800701 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="pull" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.800811 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="527f6060-14f0-48e5-b8a9-4fc91d1775a6" containerName="extract" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.801213 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.802871 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.803183 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.803415 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2tgvn" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.818140 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-29txn"] Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.852994 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqqc\" (UniqueName: \"kubernetes.io/projected/44b60556-07ce-4245-a994-dded304e075b-kube-api-access-lhqqc\") pod \"nmstate-operator-796d4cfff4-29txn\" (UID: \"44b60556-07ce-4245-a994-dded304e075b\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.953614 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqqc\" (UniqueName: \"kubernetes.io/projected/44b60556-07ce-4245-a994-dded304e075b-kube-api-access-lhqqc\") pod \"nmstate-operator-796d4cfff4-29txn\" (UID: \"44b60556-07ce-4245-a994-dded304e075b\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" Mar 19 20:17:51 crc kubenswrapper[4799]: I0319 20:17:51.990262 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqqc\" (UniqueName: \"kubernetes.io/projected/44b60556-07ce-4245-a994-dded304e075b-kube-api-access-lhqqc\") pod \"nmstate-operator-796d4cfff4-29txn\" (UID: 
\"44b60556-07ce-4245-a994-dded304e075b\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" Mar 19 20:17:52 crc kubenswrapper[4799]: I0319 20:17:52.115994 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" Mar 19 20:17:52 crc kubenswrapper[4799]: I0319 20:17:52.603309 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-29txn"] Mar 19 20:17:52 crc kubenswrapper[4799]: W0319 20:17:52.612716 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b60556_07ce_4245_a994_dded304e075b.slice/crio-bb43e89b1aa034548a551db16be0092d4c09c9b4571430017619faff2a19ef11 WatchSource:0}: Error finding container bb43e89b1aa034548a551db16be0092d4c09c9b4571430017619faff2a19ef11: Status 404 returned error can't find the container with id bb43e89b1aa034548a551db16be0092d4c09c9b4571430017619faff2a19ef11 Mar 19 20:17:53 crc kubenswrapper[4799]: I0319 20:17:53.033310 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" event={"ID":"44b60556-07ce-4245-a994-dded304e075b","Type":"ContainerStarted","Data":"bb43e89b1aa034548a551db16be0092d4c09c9b4571430017619faff2a19ef11"} Mar 19 20:17:56 crc kubenswrapper[4799]: I0319 20:17:56.050922 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" event={"ID":"44b60556-07ce-4245-a994-dded304e075b","Type":"ContainerStarted","Data":"0189753abc1ac6077a1a5907f7601944a95b4d21d35fae62142b0b5eb7792bcd"} Mar 19 20:17:56 crc kubenswrapper[4799]: I0319 20:17:56.089841 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-29txn" podStartSLOduration=2.643329704 podStartE2EDuration="5.089816074s" podCreationTimestamp="2026-03-19 20:17:51 +0000 UTC" 
firstStartedPulling="2026-03-19 20:17:52.616573185 +0000 UTC m=+750.222526297" lastFinishedPulling="2026-03-19 20:17:55.063059595 +0000 UTC m=+752.669012667" observedRunningTime="2026-03-19 20:17:56.068402944 +0000 UTC m=+753.674356036" watchObservedRunningTime="2026-03-19 20:17:56.089816074 +0000 UTC m=+753.695769156" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.069592 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.070956 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.088533 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.093093 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-psznr"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.094077 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.095581 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-m5r8b" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.100042 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.124210 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qqjcl"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.125020 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.131458 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-psznr"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219180 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj5zg\" (UniqueName: \"kubernetes.io/projected/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-kube-api-access-hj5zg\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219224 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-dbus-socket\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219251 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-nmstate-lock\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219347 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-ovs-socket\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219391 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219495 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r49qd\" (UniqueName: \"kubernetes.io/projected/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-kube-api-access-r49qd\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.219622 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2p5\" (UniqueName: \"kubernetes.io/projected/dab69c67-b7fc-4f89-93c6-6ee825d89b7d-kube-api-access-sz2p5\") pod \"nmstate-metrics-9b8c8685d-qzf64\" (UID: \"dab69c67-b7fc-4f89-93c6-6ee825d89b7d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.221432 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.222018 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.223628 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.223666 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.224036 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-sb644" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.234868 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321150 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r49qd\" (UniqueName: \"kubernetes.io/projected/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-kube-api-access-r49qd\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321208 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/540372f2-ca9d-47b0-aaa6-86831627cd8e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321243 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2p5\" (UniqueName: \"kubernetes.io/projected/dab69c67-b7fc-4f89-93c6-6ee825d89b7d-kube-api-access-sz2p5\") pod \"nmstate-metrics-9b8c8685d-qzf64\" (UID: \"dab69c67-b7fc-4f89-93c6-6ee825d89b7d\") " 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321268 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmz6\" (UniqueName: \"kubernetes.io/projected/540372f2-ca9d-47b0-aaa6-86831627cd8e-kube-api-access-sfmz6\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321299 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj5zg\" (UniqueName: \"kubernetes.io/projected/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-kube-api-access-hj5zg\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321318 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-dbus-socket\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321491 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-nmstate-lock\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321558 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-dbus-socket\") pod \"nmstate-handler-qqjcl\" (UID: 
\"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321557 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-ovs-socket\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321601 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321610 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-ovs-socket\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321630 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/540372f2-ca9d-47b0-aaa6-86831627cd8e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.321597 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-nmstate-lock\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " 
pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: E0319 20:17:57.321655 4799 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 20:17:57 crc kubenswrapper[4799]: E0319 20:17:57.321701 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-tls-key-pair podName:dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6 nodeName:}" failed. No retries permitted until 2026-03-19 20:17:57.821687727 +0000 UTC m=+755.427640799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-tls-key-pair") pod "nmstate-webhook-5f558f5558-psznr" (UID: "dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6") : secret "openshift-nmstate-webhook" not found Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.353442 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r49qd\" (UniqueName: \"kubernetes.io/projected/7c8d90dd-d173-4fd7-a3ae-ed312bc20861-kube-api-access-r49qd\") pod \"nmstate-handler-qqjcl\" (UID: \"7c8d90dd-d173-4fd7-a3ae-ed312bc20861\") " pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.357490 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj5zg\" (UniqueName: \"kubernetes.io/projected/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-kube-api-access-hj5zg\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.358407 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2p5\" (UniqueName: \"kubernetes.io/projected/dab69c67-b7fc-4f89-93c6-6ee825d89b7d-kube-api-access-sz2p5\") pod \"nmstate-metrics-9b8c8685d-qzf64\" 
(UID: \"dab69c67-b7fc-4f89-93c6-6ee825d89b7d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.387902 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.417216 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d94bf8496-2rhdn"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.417894 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.424035 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/540372f2-ca9d-47b0-aaa6-86831627cd8e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.424113 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmz6\" (UniqueName: \"kubernetes.io/projected/540372f2-ca9d-47b0-aaa6-86831627cd8e-kube-api-access-sfmz6\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.424192 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/540372f2-ca9d-47b0-aaa6-86831627cd8e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.426035 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/540372f2-ca9d-47b0-aaa6-86831627cd8e-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.439976 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/540372f2-ca9d-47b0-aaa6-86831627cd8e-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.440969 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.450068 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmz6\" (UniqueName: \"kubernetes.io/projected/540372f2-ca9d-47b0-aaa6-86831627cd8e-kube-api-access-sfmz6\") pod \"nmstate-console-plugin-86f58fcf4-9s6cx\" (UID: \"540372f2-ca9d-47b0-aaa6-86831627cd8e\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.464115 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d94bf8496-2rhdn"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.525144 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f918f598-05f1-40b6-9627-4ba33ccf997d-console-serving-cert\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 
20:17:57.525331 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f918f598-05f1-40b6-9627-4ba33ccf997d-console-oauth-config\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.525351 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-service-ca\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.525373 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-oauth-serving-cert\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.525428 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqwx\" (UniqueName: \"kubernetes.io/projected/f918f598-05f1-40b6-9627-4ba33ccf997d-kube-api-access-cpqwx\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.525473 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-console-config\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " 
pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.525501 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-trusted-ca-bundle\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.536169 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.625421 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64"] Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626543 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f918f598-05f1-40b6-9627-4ba33ccf997d-console-serving-cert\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626597 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f918f598-05f1-40b6-9627-4ba33ccf997d-console-oauth-config\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626615 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-service-ca\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " 
pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626642 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-oauth-serving-cert\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626678 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqwx\" (UniqueName: \"kubernetes.io/projected/f918f598-05f1-40b6-9627-4ba33ccf997d-kube-api-access-cpqwx\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626724 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-console-config\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.626752 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-trusted-ca-bundle\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.628316 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-trusted-ca-bundle\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " 
pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.628548 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-service-ca\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.628880 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-console-config\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.628990 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f918f598-05f1-40b6-9627-4ba33ccf997d-oauth-serving-cert\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.632374 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f918f598-05f1-40b6-9627-4ba33ccf997d-console-serving-cert\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.634578 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f918f598-05f1-40b6-9627-4ba33ccf997d-console-oauth-config\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc 
kubenswrapper[4799]: W0319 20:17:57.641724 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddab69c67_b7fc_4f89_93c6_6ee825d89b7d.slice/crio-f56745d2281692a1e844a4cf9166fe40ff5978bf7a6995d0247a29e143465522 WatchSource:0}: Error finding container f56745d2281692a1e844a4cf9166fe40ff5978bf7a6995d0247a29e143465522: Status 404 returned error can't find the container with id f56745d2281692a1e844a4cf9166fe40ff5978bf7a6995d0247a29e143465522 Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.645188 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqwx\" (UniqueName: \"kubernetes.io/projected/f918f598-05f1-40b6-9627-4ba33ccf997d-kube-api-access-cpqwx\") pod \"console-5d94bf8496-2rhdn\" (UID: \"f918f598-05f1-40b6-9627-4ba33ccf997d\") " pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.719958 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx"] Mar 19 20:17:57 crc kubenswrapper[4799]: W0319 20:17:57.728547 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod540372f2_ca9d_47b0_aaa6_86831627cd8e.slice/crio-b74c388ebfa38a2563e876a0468acbcad30c08f50b33ae14d0f5e17b5a4bd1c7 WatchSource:0}: Error finding container b74c388ebfa38a2563e876a0468acbcad30c08f50b33ae14d0f5e17b5a4bd1c7: Status 404 returned error can't find the container with id b74c388ebfa38a2563e876a0468acbcad30c08f50b33ae14d0f5e17b5a4bd1c7 Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.824849 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.829466 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:57 crc kubenswrapper[4799]: I0319 20:17:57.832668 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-psznr\" (UID: \"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:58 crc kubenswrapper[4799]: I0319 20:17:58.008825 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:17:58 crc kubenswrapper[4799]: I0319 20:17:58.062920 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" event={"ID":"dab69c67-b7fc-4f89-93c6-6ee825d89b7d","Type":"ContainerStarted","Data":"f56745d2281692a1e844a4cf9166fe40ff5978bf7a6995d0247a29e143465522"} Mar 19 20:17:58 crc kubenswrapper[4799]: I0319 20:17:58.064311 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qqjcl" event={"ID":"7c8d90dd-d173-4fd7-a3ae-ed312bc20861","Type":"ContainerStarted","Data":"8a2aa0279297a0feb323c7db0b38bf919a0966e35643a347803f41488957be51"} Mar 19 20:17:58 crc kubenswrapper[4799]: I0319 20:17:58.065415 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" 
event={"ID":"540372f2-ca9d-47b0-aaa6-86831627cd8e","Type":"ContainerStarted","Data":"b74c388ebfa38a2563e876a0468acbcad30c08f50b33ae14d0f5e17b5a4bd1c7"} Mar 19 20:17:58 crc kubenswrapper[4799]: I0319 20:17:58.262252 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-psznr"] Mar 19 20:17:58 crc kubenswrapper[4799]: W0319 20:17:58.263078 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff785fc_6dbc_40bf_a5b3_d950ed4cb6e6.slice/crio-a8e100466baf562e62abf3bfd2bcbe43526f6c4b0103aaaa8557e437c235a01e WatchSource:0}: Error finding container a8e100466baf562e62abf3bfd2bcbe43526f6c4b0103aaaa8557e437c235a01e: Status 404 returned error can't find the container with id a8e100466baf562e62abf3bfd2bcbe43526f6c4b0103aaaa8557e437c235a01e Mar 19 20:17:58 crc kubenswrapper[4799]: I0319 20:17:58.294936 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d94bf8496-2rhdn"] Mar 19 20:17:58 crc kubenswrapper[4799]: W0319 20:17:58.302503 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf918f598_05f1_40b6_9627_4ba33ccf997d.slice/crio-60b84b3e9c8540a9c7a1453918abd63d3fdd85ade2359f002b09b6292e381c38 WatchSource:0}: Error finding container 60b84b3e9c8540a9c7a1453918abd63d3fdd85ade2359f002b09b6292e381c38: Status 404 returned error can't find the container with id 60b84b3e9c8540a9c7a1453918abd63d3fdd85ade2359f002b09b6292e381c38 Mar 19 20:17:59 crc kubenswrapper[4799]: I0319 20:17:59.073335 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d94bf8496-2rhdn" event={"ID":"f918f598-05f1-40b6-9627-4ba33ccf997d","Type":"ContainerStarted","Data":"c75e00ec7d2122a226a709b7d7b9f39a55bb58e4264a8414f3ecb271db3be812"} Mar 19 20:17:59 crc kubenswrapper[4799]: I0319 20:17:59.073696 4799 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-5d94bf8496-2rhdn" event={"ID":"f918f598-05f1-40b6-9627-4ba33ccf997d","Type":"ContainerStarted","Data":"60b84b3e9c8540a9c7a1453918abd63d3fdd85ade2359f002b09b6292e381c38"} Mar 19 20:17:59 crc kubenswrapper[4799]: I0319 20:17:59.074653 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" event={"ID":"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6","Type":"ContainerStarted","Data":"a8e100466baf562e62abf3bfd2bcbe43526f6c4b0103aaaa8557e437c235a01e"} Mar 19 20:17:59 crc kubenswrapper[4799]: I0319 20:17:59.098885 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d94bf8496-2rhdn" podStartSLOduration=2.098855169 podStartE2EDuration="2.098855169s" podCreationTimestamp="2026-03-19 20:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:17:59.090715968 +0000 UTC m=+756.696669060" watchObservedRunningTime="2026-03-19 20:17:59.098855169 +0000 UTC m=+756.704808261" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.121689 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565858-mjzxf"] Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.123805 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.126400 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.126429 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.127067 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.130064 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-mjzxf"] Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.270563 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69tf5\" (UniqueName: \"kubernetes.io/projected/69e85fea-caad-40ee-b17e-e238858fa50f-kube-api-access-69tf5\") pod \"auto-csr-approver-29565858-mjzxf\" (UID: \"69e85fea-caad-40ee-b17e-e238858fa50f\") " pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.372539 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69tf5\" (UniqueName: \"kubernetes.io/projected/69e85fea-caad-40ee-b17e-e238858fa50f-kube-api-access-69tf5\") pod \"auto-csr-approver-29565858-mjzxf\" (UID: \"69e85fea-caad-40ee-b17e-e238858fa50f\") " pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.392467 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69tf5\" (UniqueName: \"kubernetes.io/projected/69e85fea-caad-40ee-b17e-e238858fa50f-kube-api-access-69tf5\") pod \"auto-csr-approver-29565858-mjzxf\" (UID: \"69e85fea-caad-40ee-b17e-e238858fa50f\") " 
pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:00 crc kubenswrapper[4799]: I0319 20:18:00.456446 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.005738 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-mjzxf"] Mar 19 20:18:01 crc kubenswrapper[4799]: W0319 20:18:01.014441 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e85fea_caad_40ee_b17e_e238858fa50f.slice/crio-4ca5a9300568b913b1a77d92f699f99f4248d15fd875d44243aa5a63918fe87f WatchSource:0}: Error finding container 4ca5a9300568b913b1a77d92f699f99f4248d15fd875d44243aa5a63918fe87f: Status 404 returned error can't find the container with id 4ca5a9300568b913b1a77d92f699f99f4248d15fd875d44243aa5a63918fe87f Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.097025 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" event={"ID":"dab69c67-b7fc-4f89-93c6-6ee825d89b7d","Type":"ContainerStarted","Data":"4db500045636ac515e3289072f1fc5fe59b31a0ac4d609853bf54e89302065ad"} Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.099032 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" event={"ID":"dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6","Type":"ContainerStarted","Data":"2904b3aac297d852267450ab9c9274a110369a8d7b2771ac40d30813103cbd43"} Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.099157 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.101336 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qqjcl" 
event={"ID":"7c8d90dd-d173-4fd7-a3ae-ed312bc20861","Type":"ContainerStarted","Data":"1745372ab14553cf9cbad1a8564363c00fc35b5591c4a00a861151c5357e7447"} Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.101435 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.102827 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" event={"ID":"69e85fea-caad-40ee-b17e-e238858fa50f","Type":"ContainerStarted","Data":"4ca5a9300568b913b1a77d92f699f99f4248d15fd875d44243aa5a63918fe87f"} Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.105665 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" event={"ID":"540372f2-ca9d-47b0-aaa6-86831627cd8e","Type":"ContainerStarted","Data":"d84563caef382a213fb18f75c6e72f0d9f0f2686d2875a8dacfddd063f8eaaac"} Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.139136 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" podStartSLOduration=1.762122889 podStartE2EDuration="4.13911189s" podCreationTimestamp="2026-03-19 20:17:57 +0000 UTC" firstStartedPulling="2026-03-19 20:17:58.265014442 +0000 UTC m=+755.870967534" lastFinishedPulling="2026-03-19 20:18:00.642003463 +0000 UTC m=+758.247956535" observedRunningTime="2026-03-19 20:18:01.12899377 +0000 UTC m=+758.734946862" watchObservedRunningTime="2026-03-19 20:18:01.13911189 +0000 UTC m=+758.745064982" Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.171055 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qqjcl" podStartSLOduration=1.049085181 podStartE2EDuration="4.171035s" podCreationTimestamp="2026-03-19 20:17:57 +0000 UTC" firstStartedPulling="2026-03-19 20:17:57.513648176 +0000 UTC m=+755.119601248" 
lastFinishedPulling="2026-03-19 20:18:00.635597995 +0000 UTC m=+758.241551067" observedRunningTime="2026-03-19 20:18:01.168741283 +0000 UTC m=+758.774694355" watchObservedRunningTime="2026-03-19 20:18:01.171035 +0000 UTC m=+758.776988072" Mar 19 20:18:01 crc kubenswrapper[4799]: I0319 20:18:01.187548 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9s6cx" podStartSLOduration=1.299139337 podStartE2EDuration="4.187532578s" podCreationTimestamp="2026-03-19 20:17:57 +0000 UTC" firstStartedPulling="2026-03-19 20:17:57.730545592 +0000 UTC m=+755.336498664" lastFinishedPulling="2026-03-19 20:18:00.618938833 +0000 UTC m=+758.224891905" observedRunningTime="2026-03-19 20:18:01.184639367 +0000 UTC m=+758.790592449" watchObservedRunningTime="2026-03-19 20:18:01.187532578 +0000 UTC m=+758.793485650" Mar 19 20:18:03 crc kubenswrapper[4799]: I0319 20:18:03.125891 4799 generic.go:334] "Generic (PLEG): container finished" podID="69e85fea-caad-40ee-b17e-e238858fa50f" containerID="3b17faeda0064e60c2e0c03ef3629515e8156ae2db55a7bd5ffefc54b881fffb" exitCode=0 Mar 19 20:18:03 crc kubenswrapper[4799]: I0319 20:18:03.126047 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" event={"ID":"69e85fea-caad-40ee-b17e-e238858fa50f","Type":"ContainerDied","Data":"3b17faeda0064e60c2e0c03ef3629515e8156ae2db55a7bd5ffefc54b881fffb"} Mar 19 20:18:04 crc kubenswrapper[4799]: I0319 20:18:04.142055 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" event={"ID":"dab69c67-b7fc-4f89-93c6-6ee825d89b7d","Type":"ContainerStarted","Data":"9c31580fc627fa29f60a06469d56dc44c40c10069b846653a841b8fdf28bc92a"} Mar 19 20:18:04 crc kubenswrapper[4799]: I0319 20:18:04.176403 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qzf64" 
podStartSLOduration=1.2512884340000001 podStartE2EDuration="7.176355694s" podCreationTimestamp="2026-03-19 20:17:57 +0000 UTC" firstStartedPulling="2026-03-19 20:17:57.643483798 +0000 UTC m=+755.249436870" lastFinishedPulling="2026-03-19 20:18:03.568551058 +0000 UTC m=+761.174504130" observedRunningTime="2026-03-19 20:18:04.174444487 +0000 UTC m=+761.780397589" watchObservedRunningTime="2026-03-19 20:18:04.176355694 +0000 UTC m=+761.782308816" Mar 19 20:18:04 crc kubenswrapper[4799]: I0319 20:18:04.462599 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:04 crc kubenswrapper[4799]: I0319 20:18:04.629939 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69tf5\" (UniqueName: \"kubernetes.io/projected/69e85fea-caad-40ee-b17e-e238858fa50f-kube-api-access-69tf5\") pod \"69e85fea-caad-40ee-b17e-e238858fa50f\" (UID: \"69e85fea-caad-40ee-b17e-e238858fa50f\") " Mar 19 20:18:04 crc kubenswrapper[4799]: I0319 20:18:04.639181 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e85fea-caad-40ee-b17e-e238858fa50f-kube-api-access-69tf5" (OuterVolumeSpecName: "kube-api-access-69tf5") pod "69e85fea-caad-40ee-b17e-e238858fa50f" (UID: "69e85fea-caad-40ee-b17e-e238858fa50f"). InnerVolumeSpecName "kube-api-access-69tf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:04 crc kubenswrapper[4799]: I0319 20:18:04.732237 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69tf5\" (UniqueName: \"kubernetes.io/projected/69e85fea-caad-40ee-b17e-e238858fa50f-kube-api-access-69tf5\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:05 crc kubenswrapper[4799]: I0319 20:18:05.151945 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" Mar 19 20:18:05 crc kubenswrapper[4799]: I0319 20:18:05.152351 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-mjzxf" event={"ID":"69e85fea-caad-40ee-b17e-e238858fa50f","Type":"ContainerDied","Data":"4ca5a9300568b913b1a77d92f699f99f4248d15fd875d44243aa5a63918fe87f"} Mar 19 20:18:05 crc kubenswrapper[4799]: I0319 20:18:05.152373 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca5a9300568b913b1a77d92f699f99f4248d15fd875d44243aa5a63918fe87f" Mar 19 20:18:05 crc kubenswrapper[4799]: I0319 20:18:05.567289 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-cqlgg"] Mar 19 20:18:05 crc kubenswrapper[4799]: I0319 20:18:05.575369 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-cqlgg"] Mar 19 20:18:06 crc kubenswrapper[4799]: I0319 20:18:06.371894 4799 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 20:18:07 crc kubenswrapper[4799]: I0319 20:18:07.130690 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c55479-c3ec-4e72-b12d-d287a8c82f42" path="/var/lib/kubelet/pods/79c55479-c3ec-4e72-b12d-d287a8c82f42/volumes" Mar 19 20:18:07 crc kubenswrapper[4799]: I0319 20:18:07.506115 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qqjcl" Mar 19 20:18:07 crc kubenswrapper[4799]: I0319 20:18:07.825857 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:18:07 crc kubenswrapper[4799]: I0319 20:18:07.825944 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:18:07 crc 
kubenswrapper[4799]: I0319 20:18:07.833824 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:18:08 crc kubenswrapper[4799]: I0319 20:18:08.177282 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d94bf8496-2rhdn" Mar 19 20:18:08 crc kubenswrapper[4799]: I0319 20:18:08.252923 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lq2lw"] Mar 19 20:18:18 crc kubenswrapper[4799]: I0319 20:18:18.017836 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-psznr" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.132645 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb"] Mar 19 20:18:33 crc kubenswrapper[4799]: E0319 20:18:33.133546 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e85fea-caad-40ee-b17e-e238858fa50f" containerName="oc" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.133566 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e85fea-caad-40ee-b17e-e238858fa50f" containerName="oc" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.133828 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e85fea-caad-40ee-b17e-e238858fa50f" containerName="oc" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.135236 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.139073 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.141410 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb"] Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.189022 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.189075 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbtm\" (UniqueName: \"kubernetes.io/projected/416e049b-dc1c-4119-b204-92e1e4f9513c-kube-api-access-wbbtm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.189163 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: 
I0319 20:18:33.290191 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.290281 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.290314 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbtm\" (UniqueName: \"kubernetes.io/projected/416e049b-dc1c-4119-b204-92e1e4f9513c-kube-api-access-wbbtm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.291193 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.291224 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.311542 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lq2lw" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerName="console" containerID="cri-o://7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2" gracePeriod=15 Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.324232 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbtm\" (UniqueName: \"kubernetes.io/projected/416e049b-dc1c-4119-b204-92e1e4f9513c-kube-api-access-wbbtm\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.459348 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.726953 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lq2lw_343a3122-a4be-4c67-bef4-22cd0e482cea/console/0.log" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.727011 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.898618 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-oauth-serving-cert\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.898695 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-service-ca\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.898767 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-oauth-config\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.898814 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2bxb\" (UniqueName: \"kubernetes.io/projected/343a3122-a4be-4c67-bef4-22cd0e482cea-kube-api-access-x2bxb\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.898843 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-trusted-ca-bundle\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.899019 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-console-config\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.900450 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.900502 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-serving-cert\") pod \"343a3122-a4be-4c67-bef4-22cd0e482cea\" (UID: \"343a3122-a4be-4c67-bef4-22cd0e482cea\") " Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.900479 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.900459 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-console-config" (OuterVolumeSpecName: "console-config") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.901091 4799 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.901126 4799 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.901148 4799 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.901301 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-service-ca" (OuterVolumeSpecName: "service-ca") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.905543 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343a3122-a4be-4c67-bef4-22cd0e482cea-kube-api-access-x2bxb" (OuterVolumeSpecName: "kube-api-access-x2bxb") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "kube-api-access-x2bxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.906081 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.908696 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "343a3122-a4be-4c67-bef4-22cd0e482cea" (UID: "343a3122-a4be-4c67-bef4-22cd0e482cea"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:18:33 crc kubenswrapper[4799]: I0319 20:18:33.940237 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb"] Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.002772 4799 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/343a3122-a4be-4c67-bef4-22cd0e482cea-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.002829 4799 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.002850 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2bxb\" (UniqueName: \"kubernetes.io/projected/343a3122-a4be-4c67-bef4-22cd0e482cea-kube-api-access-x2bxb\") on node \"crc\" DevicePath \"\"" Mar 
19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.002871 4799 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/343a3122-a4be-4c67-bef4-22cd0e482cea-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.386170 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lq2lw_343a3122-a4be-4c67-bef4-22cd0e482cea/console/0.log" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.386694 4799 generic.go:334] "Generic (PLEG): container finished" podID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerID="7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2" exitCode=2 Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.386788 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lq2lw" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.386791 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lq2lw" event={"ID":"343a3122-a4be-4c67-bef4-22cd0e482cea","Type":"ContainerDied","Data":"7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2"} Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.387005 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lq2lw" event={"ID":"343a3122-a4be-4c67-bef4-22cd0e482cea","Type":"ContainerDied","Data":"165cff98bbda3f03191827287e0a4d93856840ad6c9c246a0c991af64fc902f8"} Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.387056 4799 scope.go:117] "RemoveContainer" containerID="7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.391671 4799 generic.go:334] "Generic (PLEG): container finished" podID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerID="025483ee00893902faae11c331ba3cbe852d5429e8532ba6d5d42b6c04f3ce9d" 
exitCode=0 Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.391747 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" event={"ID":"416e049b-dc1c-4119-b204-92e1e4f9513c","Type":"ContainerDied","Data":"025483ee00893902faae11c331ba3cbe852d5429e8532ba6d5d42b6c04f3ce9d"} Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.391793 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" event={"ID":"416e049b-dc1c-4119-b204-92e1e4f9513c","Type":"ContainerStarted","Data":"47441e1f2ee82c0cbc555eae2cb21c0a5e2c286d9df92ae6cbd3b00d2ec5906a"} Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.414739 4799 scope.go:117] "RemoveContainer" containerID="7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2" Mar 19 20:18:34 crc kubenswrapper[4799]: E0319 20:18:34.415921 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2\": container with ID starting with 7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2 not found: ID does not exist" containerID="7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2" Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.415974 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2"} err="failed to get container status \"7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2\": rpc error: code = NotFound desc = could not find container \"7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2\": container with ID starting with 7dc785439084961b264baa85da79120fac1ba3c5dc72a37f524e6751fd1626b2 not found: ID does not exist" Mar 19 20:18:34 crc 
kubenswrapper[4799]: I0319 20:18:34.445912 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lq2lw"] Mar 19 20:18:34 crc kubenswrapper[4799]: I0319 20:18:34.453197 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lq2lw"] Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.132680 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" path="/var/lib/kubelet/pods/343a3122-a4be-4c67-bef4-22cd0e482cea/volumes" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.466718 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ldfjs"] Mar 19 20:18:35 crc kubenswrapper[4799]: E0319 20:18:35.467455 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerName="console" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.467491 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerName="console" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.467746 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="343a3122-a4be-4c67-bef4-22cd0e482cea" containerName="console" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.470524 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.481791 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldfjs"] Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.526370 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-utilities\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.526441 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-catalog-content\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.526577 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ngh\" (UniqueName: \"kubernetes.io/projected/e03b661a-874c-437e-ab2c-1ae759692012-kube-api-access-t5ngh\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.628298 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-catalog-content\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.628432 4799 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-t5ngh\" (UniqueName: \"kubernetes.io/projected/e03b661a-874c-437e-ab2c-1ae759692012-kube-api-access-t5ngh\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.628473 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-utilities\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.629224 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-catalog-content\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.629258 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-utilities\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.661275 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ngh\" (UniqueName: \"kubernetes.io/projected/e03b661a-874c-437e-ab2c-1ae759692012-kube-api-access-t5ngh\") pod \"redhat-operators-ldfjs\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:35 crc kubenswrapper[4799]: I0319 20:18:35.811905 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:36 crc kubenswrapper[4799]: I0319 20:18:36.289024 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldfjs"] Mar 19 20:18:36 crc kubenswrapper[4799]: I0319 20:18:36.411239 4799 generic.go:334] "Generic (PLEG): container finished" podID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerID="9163c71be36c187461a29920e4ddf08e38d7c0414f6bd92fbb011f041f831316" exitCode=0 Mar 19 20:18:36 crc kubenswrapper[4799]: I0319 20:18:36.411300 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" event={"ID":"416e049b-dc1c-4119-b204-92e1e4f9513c","Type":"ContainerDied","Data":"9163c71be36c187461a29920e4ddf08e38d7c0414f6bd92fbb011f041f831316"} Mar 19 20:18:36 crc kubenswrapper[4799]: I0319 20:18:36.412493 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerStarted","Data":"b2558306e1c1e42802f5a276d86ad3281cdad3243a3b517b46ef1946c12ca802"} Mar 19 20:18:37 crc kubenswrapper[4799]: I0319 20:18:37.422115 4799 generic.go:334] "Generic (PLEG): container finished" podID="e03b661a-874c-437e-ab2c-1ae759692012" containerID="5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0" exitCode=0 Mar 19 20:18:37 crc kubenswrapper[4799]: I0319 20:18:37.422239 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerDied","Data":"5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0"} Mar 19 20:18:37 crc kubenswrapper[4799]: I0319 20:18:37.426033 4799 generic.go:334] "Generic (PLEG): container finished" podID="416e049b-dc1c-4119-b204-92e1e4f9513c" 
containerID="526b005c337e61719bd1974d8e90a5647483d3200a145933a496131d809f2fc7" exitCode=0 Mar 19 20:18:37 crc kubenswrapper[4799]: I0319 20:18:37.426080 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" event={"ID":"416e049b-dc1c-4119-b204-92e1e4f9513c","Type":"ContainerDied","Data":"526b005c337e61719bd1974d8e90a5647483d3200a145933a496131d809f2fc7"} Mar 19 20:18:37 crc kubenswrapper[4799]: I0319 20:18:37.922533 4799 scope.go:117] "RemoveContainer" containerID="39137657b9976e7dec982390994cc08419caf262a5c3ae241bb7f3d308122b6d" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.437033 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerStarted","Data":"a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45"} Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.729148 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.775868 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbtm\" (UniqueName: \"kubernetes.io/projected/416e049b-dc1c-4119-b204-92e1e4f9513c-kube-api-access-wbbtm\") pod \"416e049b-dc1c-4119-b204-92e1e4f9513c\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.775975 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-bundle\") pod \"416e049b-dc1c-4119-b204-92e1e4f9513c\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.776033 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-util\") pod \"416e049b-dc1c-4119-b204-92e1e4f9513c\" (UID: \"416e049b-dc1c-4119-b204-92e1e4f9513c\") " Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.777280 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-bundle" (OuterVolumeSpecName: "bundle") pod "416e049b-dc1c-4119-b204-92e1e4f9513c" (UID: "416e049b-dc1c-4119-b204-92e1e4f9513c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.787606 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416e049b-dc1c-4119-b204-92e1e4f9513c-kube-api-access-wbbtm" (OuterVolumeSpecName: "kube-api-access-wbbtm") pod "416e049b-dc1c-4119-b204-92e1e4f9513c" (UID: "416e049b-dc1c-4119-b204-92e1e4f9513c"). InnerVolumeSpecName "kube-api-access-wbbtm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.796717 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-util" (OuterVolumeSpecName: "util") pod "416e049b-dc1c-4119-b204-92e1e4f9513c" (UID: "416e049b-dc1c-4119-b204-92e1e4f9513c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.877658 4799 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.877709 4799 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416e049b-dc1c-4119-b204-92e1e4f9513c-util\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:38 crc kubenswrapper[4799]: I0319 20:18:38.877731 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbtm\" (UniqueName: \"kubernetes.io/projected/416e049b-dc1c-4119-b204-92e1e4f9513c-kube-api-access-wbbtm\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:39 crc kubenswrapper[4799]: I0319 20:18:39.449064 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" Mar 19 20:18:39 crc kubenswrapper[4799]: I0319 20:18:39.449075 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb" event={"ID":"416e049b-dc1c-4119-b204-92e1e4f9513c","Type":"ContainerDied","Data":"47441e1f2ee82c0cbc555eae2cb21c0a5e2c286d9df92ae6cbd3b00d2ec5906a"} Mar 19 20:18:39 crc kubenswrapper[4799]: I0319 20:18:39.449151 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47441e1f2ee82c0cbc555eae2cb21c0a5e2c286d9df92ae6cbd3b00d2ec5906a" Mar 19 20:18:39 crc kubenswrapper[4799]: I0319 20:18:39.451264 4799 generic.go:334] "Generic (PLEG): container finished" podID="e03b661a-874c-437e-ab2c-1ae759692012" containerID="a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45" exitCode=0 Mar 19 20:18:39 crc kubenswrapper[4799]: I0319 20:18:39.451329 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerDied","Data":"a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45"} Mar 19 20:18:40 crc kubenswrapper[4799]: I0319 20:18:40.464223 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerStarted","Data":"be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1"} Mar 19 20:18:40 crc kubenswrapper[4799]: I0319 20:18:40.487512 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ldfjs" podStartSLOduration=2.810599097 podStartE2EDuration="5.487491027s" podCreationTimestamp="2026-03-19 20:18:35 +0000 UTC" firstStartedPulling="2026-03-19 20:18:37.426216408 +0000 UTC m=+795.032169520" 
lastFinishedPulling="2026-03-19 20:18:40.103108338 +0000 UTC m=+797.709061450" observedRunningTime="2026-03-19 20:18:40.486768299 +0000 UTC m=+798.092721391" watchObservedRunningTime="2026-03-19 20:18:40.487491027 +0000 UTC m=+798.093444109" Mar 19 20:18:45 crc kubenswrapper[4799]: I0319 20:18:45.812210 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:45 crc kubenswrapper[4799]: I0319 20:18:45.812544 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:46 crc kubenswrapper[4799]: I0319 20:18:46.873872 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ldfjs" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="registry-server" probeResult="failure" output=< Mar 19 20:18:46 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:18:46 crc kubenswrapper[4799]: > Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.582850 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9"] Mar 19 20:18:48 crc kubenswrapper[4799]: E0319 20:18:48.583055 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="util" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.583066 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="util" Mar 19 20:18:48 crc kubenswrapper[4799]: E0319 20:18:48.583085 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="extract" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.583090 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="extract" Mar 19 20:18:48 crc 
kubenswrapper[4799]: E0319 20:18:48.583101 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="pull" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.583118 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="pull" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.583213 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="416e049b-dc1c-4119-b204-92e1e4f9513c" containerName="extract" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.583567 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.585943 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.586279 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.586297 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-z9l5f" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.586283 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.586626 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.596907 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9"] Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.637796 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf9cd\" (UniqueName: \"kubernetes.io/projected/993c9a96-b852-40c4-87e6-02e706b89b25-kube-api-access-qf9cd\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.638032 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/993c9a96-b852-40c4-87e6-02e706b89b25-webhook-cert\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.638315 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/993c9a96-b852-40c4-87e6-02e706b89b25-apiservice-cert\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.740244 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/993c9a96-b852-40c4-87e6-02e706b89b25-webhook-cert\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.740328 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/993c9a96-b852-40c4-87e6-02e706b89b25-apiservice-cert\") pod 
\"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.740404 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf9cd\" (UniqueName: \"kubernetes.io/projected/993c9a96-b852-40c4-87e6-02e706b89b25-kube-api-access-qf9cd\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.745717 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/993c9a96-b852-40c4-87e6-02e706b89b25-webhook-cert\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.745729 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/993c9a96-b852-40c4-87e6-02e706b89b25-apiservice-cert\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.756438 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf9cd\" (UniqueName: \"kubernetes.io/projected/993c9a96-b852-40c4-87e6-02e706b89b25-kube-api-access-qf9cd\") pod \"metallb-operator-controller-manager-5578d7df77-xlzz9\" (UID: \"993c9a96-b852-40c4-87e6-02e706b89b25\") " pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 
20:18:48.824456 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h"] Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.825451 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.828548 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.829238 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.829906 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hvlds" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.846482 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h"] Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.940267 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.944926 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ec1faac-95e3-4189-bbfd-acc0f4662787-webhook-cert\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.945015 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ec1faac-95e3-4189-bbfd-acc0f4662787-apiservice-cert\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:48 crc kubenswrapper[4799]: I0319 20:18:48.945067 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhsdz\" (UniqueName: \"kubernetes.io/projected/0ec1faac-95e3-4189-bbfd-acc0f4662787-kube-api-access-lhsdz\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.046240 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ec1faac-95e3-4189-bbfd-acc0f4662787-apiservice-cert\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.046566 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lhsdz\" (UniqueName: \"kubernetes.io/projected/0ec1faac-95e3-4189-bbfd-acc0f4662787-kube-api-access-lhsdz\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.046587 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ec1faac-95e3-4189-bbfd-acc0f4662787-webhook-cert\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.050105 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ec1faac-95e3-4189-bbfd-acc0f4662787-webhook-cert\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.060905 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ec1faac-95e3-4189-bbfd-acc0f4662787-apiservice-cert\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.075999 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhsdz\" (UniqueName: \"kubernetes.io/projected/0ec1faac-95e3-4189-bbfd-acc0f4662787-kube-api-access-lhsdz\") pod \"metallb-operator-webhook-server-59fdf54f4b-tp45h\" (UID: \"0ec1faac-95e3-4189-bbfd-acc0f4662787\") " 
pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.138500 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.185554 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9"] Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.359411 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h"] Mar 19 20:18:49 crc kubenswrapper[4799]: W0319 20:18:49.366783 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ec1faac_95e3_4189_bbfd_acc0f4662787.slice/crio-bbcc8e68199fbdb7a94fd24d28d170309cebc931d48a0624c89482fa5b7d390c WatchSource:0}: Error finding container bbcc8e68199fbdb7a94fd24d28d170309cebc931d48a0624c89482fa5b7d390c: Status 404 returned error can't find the container with id bbcc8e68199fbdb7a94fd24d28d170309cebc931d48a0624c89482fa5b7d390c Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.524273 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" event={"ID":"0ec1faac-95e3-4189-bbfd-acc0f4662787","Type":"ContainerStarted","Data":"bbcc8e68199fbdb7a94fd24d28d170309cebc931d48a0624c89482fa5b7d390c"} Mar 19 20:18:49 crc kubenswrapper[4799]: I0319 20:18:49.526168 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" event={"ID":"993c9a96-b852-40c4-87e6-02e706b89b25","Type":"ContainerStarted","Data":"aea66526f6bcf3e41933e8c44e73311c977c767085170275bdc3ff129288181e"} Mar 19 20:18:53 crc kubenswrapper[4799]: I0319 20:18:53.561369 4799 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" event={"ID":"993c9a96-b852-40c4-87e6-02e706b89b25","Type":"ContainerStarted","Data":"5c2d00fde9e7be80e47841b8e3e5db4505b4ffc683d47dc8ed090ab325e3eb60"} Mar 19 20:18:53 crc kubenswrapper[4799]: I0319 20:18:53.562021 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:18:53 crc kubenswrapper[4799]: I0319 20:18:53.584445 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" podStartSLOduration=2.164031002 podStartE2EDuration="5.584430583s" podCreationTimestamp="2026-03-19 20:18:48 +0000 UTC" firstStartedPulling="2026-03-19 20:18:49.203467546 +0000 UTC m=+806.809420618" lastFinishedPulling="2026-03-19 20:18:52.623867127 +0000 UTC m=+810.229820199" observedRunningTime="2026-03-19 20:18:53.578128838 +0000 UTC m=+811.184081920" watchObservedRunningTime="2026-03-19 20:18:53.584430583 +0000 UTC m=+811.190383655" Mar 19 20:18:55 crc kubenswrapper[4799]: I0319 20:18:55.578105 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" event={"ID":"0ec1faac-95e3-4189-bbfd-acc0f4662787","Type":"ContainerStarted","Data":"6c108ba37b0ba6734ea214abefc1c1c061026525847c2d3b50de6c233f1aa6e1"} Mar 19 20:18:55 crc kubenswrapper[4799]: I0319 20:18:55.578623 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:18:55 crc kubenswrapper[4799]: I0319 20:18:55.615065 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" podStartSLOduration=2.06001925 podStartE2EDuration="7.615042144s" podCreationTimestamp="2026-03-19 20:18:48 +0000 UTC" firstStartedPulling="2026-03-19 
20:18:49.369519464 +0000 UTC m=+806.975472536" lastFinishedPulling="2026-03-19 20:18:54.924542368 +0000 UTC m=+812.530495430" observedRunningTime="2026-03-19 20:18:55.613163279 +0000 UTC m=+813.219116391" watchObservedRunningTime="2026-03-19 20:18:55.615042144 +0000 UTC m=+813.220995256" Mar 19 20:18:55 crc kubenswrapper[4799]: I0319 20:18:55.868859 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:55 crc kubenswrapper[4799]: I0319 20:18:55.929544 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:56 crc kubenswrapper[4799]: I0319 20:18:56.642891 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldfjs"] Mar 19 20:18:57 crc kubenswrapper[4799]: I0319 20:18:57.591085 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ldfjs" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="registry-server" containerID="cri-o://be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1" gracePeriod=2 Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.061697 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.188153 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-utilities\") pod \"e03b661a-874c-437e-ab2c-1ae759692012\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.188284 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-catalog-content\") pod \"e03b661a-874c-437e-ab2c-1ae759692012\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.188333 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5ngh\" (UniqueName: \"kubernetes.io/projected/e03b661a-874c-437e-ab2c-1ae759692012-kube-api-access-t5ngh\") pod \"e03b661a-874c-437e-ab2c-1ae759692012\" (UID: \"e03b661a-874c-437e-ab2c-1ae759692012\") " Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.189339 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-utilities" (OuterVolumeSpecName: "utilities") pod "e03b661a-874c-437e-ab2c-1ae759692012" (UID: "e03b661a-874c-437e-ab2c-1ae759692012"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.195077 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03b661a-874c-437e-ab2c-1ae759692012-kube-api-access-t5ngh" (OuterVolumeSpecName: "kube-api-access-t5ngh") pod "e03b661a-874c-437e-ab2c-1ae759692012" (UID: "e03b661a-874c-437e-ab2c-1ae759692012"). InnerVolumeSpecName "kube-api-access-t5ngh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.290267 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5ngh\" (UniqueName: \"kubernetes.io/projected/e03b661a-874c-437e-ab2c-1ae759692012-kube-api-access-t5ngh\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.290303 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.343571 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e03b661a-874c-437e-ab2c-1ae759692012" (UID: "e03b661a-874c-437e-ab2c-1ae759692012"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.391468 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03b661a-874c-437e-ab2c-1ae759692012-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.596797 4799 generic.go:334] "Generic (PLEG): container finished" podID="e03b661a-874c-437e-ab2c-1ae759692012" containerID="be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1" exitCode=0 Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.596875 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldfjs" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.596886 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerDied","Data":"be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1"} Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.597223 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldfjs" event={"ID":"e03b661a-874c-437e-ab2c-1ae759692012","Type":"ContainerDied","Data":"b2558306e1c1e42802f5a276d86ad3281cdad3243a3b517b46ef1946c12ca802"} Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.597245 4799 scope.go:117] "RemoveContainer" containerID="be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.611138 4799 scope.go:117] "RemoveContainer" containerID="a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.624273 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldfjs"] Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.627665 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ldfjs"] Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.644755 4799 scope.go:117] "RemoveContainer" containerID="5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.669408 4799 scope.go:117] "RemoveContainer" containerID="be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1" Mar 19 20:18:58 crc kubenswrapper[4799]: E0319 20:18:58.669953 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1\": container with ID starting with be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1 not found: ID does not exist" containerID="be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.670004 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1"} err="failed to get container status \"be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1\": rpc error: code = NotFound desc = could not find container \"be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1\": container with ID starting with be4fbf2dd5fc35572ffdc254f10cf94ba21ce8786d7604974c297c697cd500c1 not found: ID does not exist" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.670033 4799 scope.go:117] "RemoveContainer" containerID="a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45" Mar 19 20:18:58 crc kubenswrapper[4799]: E0319 20:18:58.672214 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45\": container with ID starting with a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45 not found: ID does not exist" containerID="a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.672246 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45"} err="failed to get container status \"a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45\": rpc error: code = NotFound desc = could not find container \"a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45\": container with ID 
starting with a4ca3acf1cd8062428b541231f76dcdcde4ceb2871b22646f43aec46e601ae45 not found: ID does not exist" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.672282 4799 scope.go:117] "RemoveContainer" containerID="5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0" Mar 19 20:18:58 crc kubenswrapper[4799]: E0319 20:18:58.672663 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0\": container with ID starting with 5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0 not found: ID does not exist" containerID="5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0" Mar 19 20:18:58 crc kubenswrapper[4799]: I0319 20:18:58.672684 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0"} err="failed to get container status \"5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0\": rpc error: code = NotFound desc = could not find container \"5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0\": container with ID starting with 5eb3adca9ebe9255421bf11aea78cbad6dca18f2c3b113957d28c5042cbcdbe0 not found: ID does not exist" Mar 19 20:18:59 crc kubenswrapper[4799]: I0319 20:18:59.129865 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03b661a-874c-437e-ab2c-1ae759692012" path="/var/lib/kubelet/pods/e03b661a-874c-437e-ab2c-1ae759692012/volumes" Mar 19 20:19:09 crc kubenswrapper[4799]: I0319 20:19:09.145936 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-59fdf54f4b-tp45h" Mar 19 20:19:28 crc kubenswrapper[4799]: I0319 20:19:28.944430 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-5578d7df77-xlzz9" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.662565 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-m52p2"] Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.662880 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="registry-server" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.662909 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="registry-server" Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.662930 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="extract-utilities" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.662941 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="extract-utilities" Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.662961 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="extract-content" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.662972 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="extract-content" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.663159 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03b661a-874c-437e-ab2c-1ae759692012" containerName="registry-server" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.666175 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.667588 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v"] Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.668287 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.669919 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.670165 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fbt2z" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.670460 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.674843 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.676286 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v"] Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.740659 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-m865t"] Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.741531 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.743303 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.743394 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.743442 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wszn6" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.745982 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.753057 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-9nd8c"] Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.753898 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.756456 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.772906 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-9nd8c"] Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.777254 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-frr-conf\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.777314 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.777344 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66cf30af-75f2-49da-a9de-cb266154b446-frr-startup\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.777434 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-reloader\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 
20:19:29.777461 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zh7l\" (UniqueName: \"kubernetes.io/projected/66cf30af-75f2-49da-a9de-cb266154b446-kube-api-access-7zh7l\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.777586 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-frr-sockets\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.778020 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-metrics\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.778063 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66cf30af-75f2-49da-a9de-cb266154b446-metrics-certs\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.778118 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gprww\" (UniqueName: \"kubernetes.io/projected/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-kube-api-access-gprww\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880072 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66cf30af-75f2-49da-a9de-cb266154b446-frr-startup\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880147 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-reloader\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880173 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zh7l\" (UniqueName: \"kubernetes.io/projected/66cf30af-75f2-49da-a9de-cb266154b446-kube-api-access-7zh7l\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880202 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-metrics-certs\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880229 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880252 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/f71d2873-ac06-4cca-b70c-162e283e23b8-metallb-excludel2\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880321 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-frr-sockets\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880346 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-metrics\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880394 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66cf30af-75f2-49da-a9de-cb266154b446-metrics-certs\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880417 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-metrics-certs\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880440 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6rw\" (UniqueName: \"kubernetes.io/projected/f71d2873-ac06-4cca-b70c-162e283e23b8-kube-api-access-5x6rw\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " 
pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880458 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gprww\" (UniqueName: \"kubernetes.io/projected/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-kube-api-access-gprww\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880488 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-cert\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880508 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qj7c\" (UniqueName: \"kubernetes.io/projected/8a6fd137-8e20-4043-b746-7d4b884ffc5a-kube-api-access-2qj7c\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880529 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-frr-conf\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880546 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.880625 4799 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.880669 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-cert podName:b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a nodeName:}" failed. No retries permitted until 2026-03-19 20:19:30.380654613 +0000 UTC m=+847.986607685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-cert") pod "frr-k8s-webhook-server-bcc4b6f68-2gv7v" (UID: "b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a") : secret "frr-k8s-webhook-server-cert" not found Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880740 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-frr-sockets\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880737 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-reloader\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.880866 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-metrics\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.881042 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66cf30af-75f2-49da-a9de-cb266154b446-frr-conf\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.881071 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66cf30af-75f2-49da-a9de-cb266154b446-frr-startup\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.888011 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66cf30af-75f2-49da-a9de-cb266154b446-metrics-certs\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.898036 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gprww\" (UniqueName: \"kubernetes.io/projected/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-kube-api-access-gprww\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.898079 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zh7l\" (UniqueName: \"kubernetes.io/projected/66cf30af-75f2-49da-a9de-cb266154b446-kube-api-access-7zh7l\") pod \"frr-k8s-m52p2\" (UID: \"66cf30af-75f2-49da-a9de-cb266154b446\") " pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.981790 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-metrics-certs\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.982108 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.982124 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f71d2873-ac06-4cca-b70c-162e283e23b8-metallb-excludel2\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.982159 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-metrics-certs\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.982183 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6rw\" (UniqueName: \"kubernetes.io/projected/f71d2873-ac06-4cca-b70c-162e283e23b8-kube-api-access-5x6rw\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.981949 4799 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.982202 4799 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: 
secret "metallb-memberlist" not found Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.982214 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-cert\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.982268 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist podName:f71d2873-ac06-4cca-b70c-162e283e23b8 nodeName:}" failed. No retries permitted until 2026-03-19 20:19:30.482250581 +0000 UTC m=+848.088203653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist") pod "speaker-m865t" (UID: "f71d2873-ac06-4cca-b70c-162e283e23b8") : secret "metallb-memberlist" not found Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.982283 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-metrics-certs podName:8a6fd137-8e20-4043-b746-7d4b884ffc5a nodeName:}" failed. No retries permitted until 2026-03-19 20:19:30.482276572 +0000 UTC m=+848.088229644 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-metrics-certs") pod "controller-7bb4cc7c98-9nd8c" (UID: "8a6fd137-8e20-4043-b746-7d4b884ffc5a") : secret "controller-certs-secret" not found Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.982327 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qj7c\" (UniqueName: \"kubernetes.io/projected/8a6fd137-8e20-4043-b746-7d4b884ffc5a-kube-api-access-2qj7c\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.982368 4799 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 19 20:19:29 crc kubenswrapper[4799]: E0319 20:19:29.982438 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-metrics-certs podName:f71d2873-ac06-4cca-b70c-162e283e23b8 nodeName:}" failed. No retries permitted until 2026-03-19 20:19:30.482416355 +0000 UTC m=+848.088369627 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-metrics-certs") pod "speaker-m865t" (UID: "f71d2873-ac06-4cca-b70c-162e283e23b8") : secret "speaker-certs-secret" not found Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.983027 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f71d2873-ac06-4cca-b70c-162e283e23b8-metallb-excludel2\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.986310 4799 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.986352 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:29 crc kubenswrapper[4799]: I0319 20:19:29.996773 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-cert\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.004629 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6rw\" (UniqueName: \"kubernetes.io/projected/f71d2873-ac06-4cca-b70c-162e283e23b8-kube-api-access-5x6rw\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.009041 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qj7c\" (UniqueName: \"kubernetes.io/projected/8a6fd137-8e20-4043-b746-7d4b884ffc5a-kube-api-access-2qj7c\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: 
\"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.132919 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.387754 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.392118 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-2gv7v\" (UID: \"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.489306 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-metrics-certs\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.489448 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.489558 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-metrics-certs\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:30 crc kubenswrapper[4799]: E0319 20:19:30.489639 4799 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 20:19:30 crc kubenswrapper[4799]: E0319 20:19:30.489741 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist podName:f71d2873-ac06-4cca-b70c-162e283e23b8 nodeName:}" failed. No retries permitted until 2026-03-19 20:19:31.489715295 +0000 UTC m=+849.095668407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist") pod "speaker-m865t" (UID: "f71d2873-ac06-4cca-b70c-162e283e23b8") : secret "metallb-memberlist" not found Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.498687 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a6fd137-8e20-4043-b746-7d4b884ffc5a-metrics-certs\") pod \"controller-7bb4cc7c98-9nd8c\" (UID: \"8a6fd137-8e20-4043-b746-7d4b884ffc5a\") " pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.501065 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-metrics-certs\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.593882 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.665845 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.817805 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"def8b31448fa42360b0adf5b0acba0bd382d510b146ba2584d471a05a8486721"} Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.868820 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v"] Mar 19 20:19:30 crc kubenswrapper[4799]: W0319 20:19:30.873526 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5cbaa72_f84a_4672_bb65_f67e4cf5ac5a.slice/crio-1a606af4c6c74d6c74785beb683c516e4bb84ce457afabb4e4c7cee748e9dc0c WatchSource:0}: Error finding container 1a606af4c6c74d6c74785beb683c516e4bb84ce457afabb4e4c7cee748e9dc0c: Status 404 returned error can't find the container with id 1a606af4c6c74d6c74785beb683c516e4bb84ce457afabb4e4c7cee748e9dc0c Mar 19 20:19:30 crc kubenswrapper[4799]: I0319 20:19:30.922880 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-9nd8c"] Mar 19 20:19:30 crc kubenswrapper[4799]: W0319 20:19:30.929628 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a6fd137_8e20_4043_b746_7d4b884ffc5a.slice/crio-bcc76decf103584a3a8d45ae5aeeed9432352f671432ba40fe0b9992a0f41de5 WatchSource:0}: Error finding container bcc76decf103584a3a8d45ae5aeeed9432352f671432ba40fe0b9992a0f41de5: Status 404 returned error can't find the container with id 
bcc76decf103584a3a8d45ae5aeeed9432352f671432ba40fe0b9992a0f41de5 Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.506573 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.516953 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f71d2873-ac06-4cca-b70c-162e283e23b8-memberlist\") pod \"speaker-m865t\" (UID: \"f71d2873-ac06-4cca-b70c-162e283e23b8\") " pod="metallb-system/speaker-m865t" Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.560101 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-m865t" Mar 19 20:19:31 crc kubenswrapper[4799]: W0319 20:19:31.588786 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf71d2873_ac06_4cca_b70c_162e283e23b8.slice/crio-af6515bf107d193b85147b630865e17924892edc263fb36d73cd986f7cc0d719 WatchSource:0}: Error finding container af6515bf107d193b85147b630865e17924892edc263fb36d73cd986f7cc0d719: Status 404 returned error can't find the container with id af6515bf107d193b85147b630865e17924892edc263fb36d73cd986f7cc0d719 Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.835350 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" event={"ID":"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a","Type":"ContainerStarted","Data":"1a606af4c6c74d6c74785beb683c516e4bb84ce457afabb4e4c7cee748e9dc0c"} Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.847366 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-9nd8c" 
event={"ID":"8a6fd137-8e20-4043-b746-7d4b884ffc5a","Type":"ContainerStarted","Data":"9faf1286f84e726e01307f7e24c300f8d006cb5a4aa83eaeabf5c3a750150efa"} Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.847451 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-9nd8c" event={"ID":"8a6fd137-8e20-4043-b746-7d4b884ffc5a","Type":"ContainerStarted","Data":"3231f4ff5c70364c11ecc64e3bd3b737d4f4f94a68d38ce33a463666ac06dbfb"} Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.847464 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-9nd8c" event={"ID":"8a6fd137-8e20-4043-b746-7d4b884ffc5a","Type":"ContainerStarted","Data":"bcc76decf103584a3a8d45ae5aeeed9432352f671432ba40fe0b9992a0f41de5"} Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.847724 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.855203 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m865t" event={"ID":"f71d2873-ac06-4cca-b70c-162e283e23b8","Type":"ContainerStarted","Data":"af6515bf107d193b85147b630865e17924892edc263fb36d73cd986f7cc0d719"} Mar 19 20:19:31 crc kubenswrapper[4799]: I0319 20:19:31.867243 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-9nd8c" podStartSLOduration=2.867227359 podStartE2EDuration="2.867227359s" podCreationTimestamp="2026-03-19 20:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:19:31.865269199 +0000 UTC m=+849.471222271" watchObservedRunningTime="2026-03-19 20:19:31.867227359 +0000 UTC m=+849.473180421" Mar 19 20:19:32 crc kubenswrapper[4799]: I0319 20:19:32.863899 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-m865t" event={"ID":"f71d2873-ac06-4cca-b70c-162e283e23b8","Type":"ContainerStarted","Data":"e9eb38b4b3c35b507db0ec2c0442430149c8b195aa429f5211715b3c8b742ad3"} Mar 19 20:19:32 crc kubenswrapper[4799]: I0319 20:19:32.864193 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m865t" event={"ID":"f71d2873-ac06-4cca-b70c-162e283e23b8","Type":"ContainerStarted","Data":"dad52160dd453e32339697642a0be08d710d6ab1b0cac12484028a3f9a8888ba"} Mar 19 20:19:33 crc kubenswrapper[4799]: I0319 20:19:33.141914 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-m865t" podStartSLOduration=4.141896905 podStartE2EDuration="4.141896905s" podCreationTimestamp="2026-03-19 20:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:19:32.878571175 +0000 UTC m=+850.484524247" watchObservedRunningTime="2026-03-19 20:19:33.141896905 +0000 UTC m=+850.747849977" Mar 19 20:19:33 crc kubenswrapper[4799]: I0319 20:19:33.869519 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-m865t" Mar 19 20:19:37 crc kubenswrapper[4799]: I0319 20:19:37.921538 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" event={"ID":"b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a","Type":"ContainerStarted","Data":"088e6f053cbedfbad594b20a4266e792dc92c3a2c32ec520806d94238f97973e"} Mar 19 20:19:37 crc kubenswrapper[4799]: I0319 20:19:37.921929 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:37 crc kubenswrapper[4799]: I0319 20:19:37.928299 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" 
event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerDied","Data":"6f4be3e86c4c6ec9b4d67ba58add6d89f1337f66f8fcaaed43d60a1a64743e0d"} Mar 19 20:19:37 crc kubenswrapper[4799]: I0319 20:19:37.927929 4799 generic.go:334] "Generic (PLEG): container finished" podID="66cf30af-75f2-49da-a9de-cb266154b446" containerID="6f4be3e86c4c6ec9b4d67ba58add6d89f1337f66f8fcaaed43d60a1a64743e0d" exitCode=0 Mar 19 20:19:37 crc kubenswrapper[4799]: I0319 20:19:37.973108 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" podStartSLOduration=2.160952014 podStartE2EDuration="8.973090898s" podCreationTimestamp="2026-03-19 20:19:29 +0000 UTC" firstStartedPulling="2026-03-19 20:19:30.875535533 +0000 UTC m=+848.481488605" lastFinishedPulling="2026-03-19 20:19:37.687674417 +0000 UTC m=+855.293627489" observedRunningTime="2026-03-19 20:19:37.939091059 +0000 UTC m=+855.545044131" watchObservedRunningTime="2026-03-19 20:19:37.973090898 +0000 UTC m=+855.579043970" Mar 19 20:19:38 crc kubenswrapper[4799]: I0319 20:19:38.940488 4799 generic.go:334] "Generic (PLEG): container finished" podID="66cf30af-75f2-49da-a9de-cb266154b446" containerID="40686007dcfee266db1ba8bc524d17210c8a15bb8ca99f2e550c0620b90bae46" exitCode=0 Mar 19 20:19:38 crc kubenswrapper[4799]: I0319 20:19:38.940605 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerDied","Data":"40686007dcfee266db1ba8bc524d17210c8a15bb8ca99f2e550c0620b90bae46"} Mar 19 20:19:39 crc kubenswrapper[4799]: I0319 20:19:39.950197 4799 generic.go:334] "Generic (PLEG): container finished" podID="66cf30af-75f2-49da-a9de-cb266154b446" containerID="7915e6e23588c1b4551df7936be5fba36c5897f183cd42704b32bdcd41d3e310" exitCode=0 Mar 19 20:19:39 crc kubenswrapper[4799]: I0319 20:19:39.950300 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerDied","Data":"7915e6e23588c1b4551df7936be5fba36c5897f183cd42704b32bdcd41d3e310"} Mar 19 20:19:40 crc kubenswrapper[4799]: I0319 20:19:40.968136 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"4caf7066d50e1aa05af2cd841e81c1c2a2d5f53354c5295d82c990e48191bf50"} Mar 19 20:19:40 crc kubenswrapper[4799]: I0319 20:19:40.968431 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"d73ab04210259b395e63d9187f19273c26f29581c1560982e57f0f591fe35fb9"} Mar 19 20:19:40 crc kubenswrapper[4799]: I0319 20:19:40.968444 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"e08db40a991d2128795f22c4e282b27b7e483886c15bee478f605940915e8e20"} Mar 19 20:19:40 crc kubenswrapper[4799]: I0319 20:19:40.968456 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"8a9f364d93a53f9b6319aafc0072572de585b7f6540a4b5cf6c8267c218fc829"} Mar 19 20:19:40 crc kubenswrapper[4799]: I0319 20:19:40.968467 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"143c9d61333ecd502f047736e23da9c50f0eeac3d7906f715902b4aca9cd2b27"} Mar 19 20:19:41 crc kubenswrapper[4799]: I0319 20:19:41.565451 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-m865t" Mar 19 20:19:41 crc kubenswrapper[4799]: I0319 20:19:41.983379 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-m52p2" event={"ID":"66cf30af-75f2-49da-a9de-cb266154b446","Type":"ContainerStarted","Data":"a0089bb3a7e71ac73e17dfd53da7af1dca7c169a54422c43ef98529a32fcf293"} Mar 19 20:19:41 crc kubenswrapper[4799]: I0319 20:19:41.984647 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:42 crc kubenswrapper[4799]: I0319 20:19:42.023467 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-m52p2" podStartSLOduration=5.501484275 podStartE2EDuration="13.023433666s" podCreationTimestamp="2026-03-19 20:19:29 +0000 UTC" firstStartedPulling="2026-03-19 20:19:30.132663151 +0000 UTC m=+847.738616223" lastFinishedPulling="2026-03-19 20:19:37.654612542 +0000 UTC m=+855.260565614" observedRunningTime="2026-03-19 20:19:42.021920727 +0000 UTC m=+859.627873829" watchObservedRunningTime="2026-03-19 20:19:42.023433666 +0000 UTC m=+859.629386778" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.262428 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c67tv"] Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.266051 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.271009 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.271116 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.271427 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wb4qj" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.283731 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c67tv"] Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.306427 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnkv2\" (UniqueName: \"kubernetes.io/projected/13276c16-34d1-424d-97fd-cf28e5f3c14d-kube-api-access-fnkv2\") pod \"openstack-operator-index-c67tv\" (UID: \"13276c16-34d1-424d-97fd-cf28e5f3c14d\") " pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.407250 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnkv2\" (UniqueName: \"kubernetes.io/projected/13276c16-34d1-424d-97fd-cf28e5f3c14d-kube-api-access-fnkv2\") pod \"openstack-operator-index-c67tv\" (UID: \"13276c16-34d1-424d-97fd-cf28e5f3c14d\") " pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.429215 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnkv2\" (UniqueName: \"kubernetes.io/projected/13276c16-34d1-424d-97fd-cf28e5f3c14d-kube-api-access-fnkv2\") pod \"openstack-operator-index-c67tv\" (UID: 
\"13276c16-34d1-424d-97fd-cf28e5f3c14d\") " pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.592952 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:44 crc kubenswrapper[4799]: I0319 20:19:44.987159 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:45 crc kubenswrapper[4799]: I0319 20:19:45.026632 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-m52p2" Mar 19 20:19:45 crc kubenswrapper[4799]: I0319 20:19:45.144143 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c67tv"] Mar 19 20:19:46 crc kubenswrapper[4799]: I0319 20:19:46.028100 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c67tv" event={"ID":"13276c16-34d1-424d-97fd-cf28e5f3c14d","Type":"ContainerStarted","Data":"b45fb6fd802629aaf11303511f61ce59b944717a4f050d7e3012bc90bc0f25df"} Mar 19 20:19:47 crc kubenswrapper[4799]: I0319 20:19:47.036973 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c67tv" event={"ID":"13276c16-34d1-424d-97fd-cf28e5f3c14d","Type":"ContainerStarted","Data":"eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d"} Mar 19 20:19:47 crc kubenswrapper[4799]: I0319 20:19:47.066991 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c67tv" podStartSLOduration=2.190380601 podStartE2EDuration="3.066963621s" podCreationTimestamp="2026-03-19 20:19:44 +0000 UTC" firstStartedPulling="2026-03-19 20:19:45.149774659 +0000 UTC m=+862.755727741" lastFinishedPulling="2026-03-19 20:19:46.026357659 +0000 UTC m=+863.632310761" observedRunningTime="2026-03-19 20:19:47.058457941 
+0000 UTC m=+864.664411043" watchObservedRunningTime="2026-03-19 20:19:47.066963621 +0000 UTC m=+864.672916723" Mar 19 20:19:47 crc kubenswrapper[4799]: I0319 20:19:47.622476 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-c67tv"] Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.232206 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zw5gb"] Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.234507 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.246743 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zw5gb"] Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.364755 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbc9\" (UniqueName: \"kubernetes.io/projected/d4a4d93a-af9b-49a6-8786-34f07a5a4ba4-kube-api-access-9tbc9\") pod \"openstack-operator-index-zw5gb\" (UID: \"d4a4d93a-af9b-49a6-8786-34f07a5a4ba4\") " pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.465967 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbc9\" (UniqueName: \"kubernetes.io/projected/d4a4d93a-af9b-49a6-8786-34f07a5a4ba4-kube-api-access-9tbc9\") pod \"openstack-operator-index-zw5gb\" (UID: \"d4a4d93a-af9b-49a6-8786-34f07a5a4ba4\") " pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.498473 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbc9\" (UniqueName: \"kubernetes.io/projected/d4a4d93a-af9b-49a6-8786-34f07a5a4ba4-kube-api-access-9tbc9\") pod \"openstack-operator-index-zw5gb\" (UID: 
\"d4a4d93a-af9b-49a6-8786-34f07a5a4ba4\") " pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:48 crc kubenswrapper[4799]: I0319 20:19:48.578024 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:49 crc kubenswrapper[4799]: I0319 20:19:49.053116 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-c67tv" podUID="13276c16-34d1-424d-97fd-cf28e5f3c14d" containerName="registry-server" containerID="cri-o://eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d" gracePeriod=2 Mar 19 20:19:49 crc kubenswrapper[4799]: I0319 20:19:49.104570 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zw5gb"] Mar 19 20:19:49 crc kubenswrapper[4799]: W0319 20:19:49.176710 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a4d93a_af9b_49a6_8786_34f07a5a4ba4.slice/crio-7010eeeb28f83618cf8d58d1431aa381a56d38d872dab0abbe1a42c19999bf69 WatchSource:0}: Error finding container 7010eeeb28f83618cf8d58d1431aa381a56d38d872dab0abbe1a42c19999bf69: Status 404 returned error can't find the container with id 7010eeeb28f83618cf8d58d1431aa381a56d38d872dab0abbe1a42c19999bf69 Mar 19 20:19:49 crc kubenswrapper[4799]: I0319 20:19:49.464802 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:49 crc kubenswrapper[4799]: I0319 20:19:49.581477 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnkv2\" (UniqueName: \"kubernetes.io/projected/13276c16-34d1-424d-97fd-cf28e5f3c14d-kube-api-access-fnkv2\") pod \"13276c16-34d1-424d-97fd-cf28e5f3c14d\" (UID: \"13276c16-34d1-424d-97fd-cf28e5f3c14d\") " Mar 19 20:19:49 crc kubenswrapper[4799]: I0319 20:19:49.590693 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13276c16-34d1-424d-97fd-cf28e5f3c14d-kube-api-access-fnkv2" (OuterVolumeSpecName: "kube-api-access-fnkv2") pod "13276c16-34d1-424d-97fd-cf28e5f3c14d" (UID: "13276c16-34d1-424d-97fd-cf28e5f3c14d"). InnerVolumeSpecName "kube-api-access-fnkv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:19:49 crc kubenswrapper[4799]: I0319 20:19:49.683933 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnkv2\" (UniqueName: \"kubernetes.io/projected/13276c16-34d1-424d-97fd-cf28e5f3c14d-kube-api-access-fnkv2\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.063277 4799 generic.go:334] "Generic (PLEG): container finished" podID="13276c16-34d1-424d-97fd-cf28e5f3c14d" containerID="eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d" exitCode=0 Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.063336 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c67tv" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.063349 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c67tv" event={"ID":"13276c16-34d1-424d-97fd-cf28e5f3c14d","Type":"ContainerDied","Data":"eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d"} Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.063507 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c67tv" event={"ID":"13276c16-34d1-424d-97fd-cf28e5f3c14d","Type":"ContainerDied","Data":"b45fb6fd802629aaf11303511f61ce59b944717a4f050d7e3012bc90bc0f25df"} Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.063547 4799 scope.go:117] "RemoveContainer" containerID="eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.066089 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zw5gb" event={"ID":"d4a4d93a-af9b-49a6-8786-34f07a5a4ba4","Type":"ContainerStarted","Data":"1abc215bc5987dd011f21fa619bbb0a9ab662ec29b657417c16c5f107e898493"} Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.066156 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zw5gb" event={"ID":"d4a4d93a-af9b-49a6-8786-34f07a5a4ba4","Type":"ContainerStarted","Data":"7010eeeb28f83618cf8d58d1431aa381a56d38d872dab0abbe1a42c19999bf69"} Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.097960 4799 scope.go:117] "RemoveContainer" containerID="eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d" Mar 19 20:19:50 crc kubenswrapper[4799]: E0319 20:19:50.098627 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d\": container with ID starting with eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d not found: ID does not exist" containerID="eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.098704 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d"} err="failed to get container status \"eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d\": rpc error: code = NotFound desc = could not find container \"eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d\": container with ID starting with eb9ce98af3c95bfba1143dcf8cb8f2e724c539559dc5436a5704c070068f450d not found: ID does not exist" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.099855 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zw5gb" podStartSLOduration=1.579483838 podStartE2EDuration="2.099825085s" podCreationTimestamp="2026-03-19 20:19:48 +0000 UTC" firstStartedPulling="2026-03-19 20:19:49.184241376 +0000 UTC m=+866.790194488" lastFinishedPulling="2026-03-19 20:19:49.704582623 +0000 UTC m=+867.310535735" observedRunningTime="2026-03-19 20:19:50.094782105 +0000 UTC m=+867.700735217" watchObservedRunningTime="2026-03-19 20:19:50.099825085 +0000 UTC m=+867.705778197" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.120053 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-c67tv"] Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.128472 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-c67tv"] Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.606553 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-2gv7v" Mar 19 20:19:50 crc kubenswrapper[4799]: I0319 20:19:50.676679 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-9nd8c" Mar 19 20:19:51 crc kubenswrapper[4799]: I0319 20:19:51.127927 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13276c16-34d1-424d-97fd-cf28e5f3c14d" path="/var/lib/kubelet/pods/13276c16-34d1-424d-97fd-cf28e5f3c14d/volumes" Mar 19 20:19:58 crc kubenswrapper[4799]: I0319 20:19:58.579123 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:58 crc kubenswrapper[4799]: I0319 20:19:58.579783 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:58 crc kubenswrapper[4799]: I0319 20:19:58.631779 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:58 crc kubenswrapper[4799]: I0319 20:19:58.756105 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:19:58 crc kubenswrapper[4799]: I0319 20:19:58.756190 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.175334 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-zw5gb" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.920775 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq"] Mar 19 20:19:59 crc kubenswrapper[4799]: E0319 20:19:59.921309 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13276c16-34d1-424d-97fd-cf28e5f3c14d" containerName="registry-server" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.921329 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="13276c16-34d1-424d-97fd-cf28e5f3c14d" containerName="registry-server" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.932846 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="13276c16-34d1-424d-97fd-cf28e5f3c14d" containerName="registry-server" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.934288 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.938001 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xjwb8" Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.940131 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq"] Mar 19 20:19:59 crc kubenswrapper[4799]: I0319 20:19:59.989177 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-m52p2" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.040358 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: 
\"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.040511 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsrgb\" (UniqueName: \"kubernetes.io/projected/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-kube-api-access-lsrgb\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.040909 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.128197 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565860-pf9sc"] Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.129060 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.135577 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-pf9sc"] Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.136216 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.136496 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.136572 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.141998 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.142162 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.142231 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsrgb\" (UniqueName: \"kubernetes.io/projected/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-kube-api-access-lsrgb\") pod 
\"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.142683 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.142736 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.176362 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsrgb\" (UniqueName: \"kubernetes.io/projected/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-kube-api-access-lsrgb\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.243695 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xq6s\" (UniqueName: \"kubernetes.io/projected/2fed39f2-bbdc-492f-be94-cdde1c1798ed-kube-api-access-7xq6s\") pod \"auto-csr-approver-29565860-pf9sc\" (UID: \"2fed39f2-bbdc-492f-be94-cdde1c1798ed\") " 
pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.255457 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.344885 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xq6s\" (UniqueName: \"kubernetes.io/projected/2fed39f2-bbdc-492f-be94-cdde1c1798ed-kube-api-access-7xq6s\") pod \"auto-csr-approver-29565860-pf9sc\" (UID: \"2fed39f2-bbdc-492f-be94-cdde1c1798ed\") " pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.392090 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xq6s\" (UniqueName: \"kubernetes.io/projected/2fed39f2-bbdc-492f-be94-cdde1c1798ed-kube-api-access-7xq6s\") pod \"auto-csr-approver-29565860-pf9sc\" (UID: \"2fed39f2-bbdc-492f-be94-cdde1c1798ed\") " pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.446811 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:00 crc kubenswrapper[4799]: I0319 20:20:00.571097 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq"] Mar 19 20:20:01 crc kubenswrapper[4799]: I0319 20:20:00.943059 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-pf9sc"] Mar 19 20:20:01 crc kubenswrapper[4799]: W0319 20:20:00.947174 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fed39f2_bbdc_492f_be94_cdde1c1798ed.slice/crio-2346ede970686ed1907fc4ef4e90b265b93ed95522179ec1eed3d16aef7642df WatchSource:0}: Error finding container 2346ede970686ed1907fc4ef4e90b265b93ed95522179ec1eed3d16aef7642df: Status 404 returned error can't find the container with id 2346ede970686ed1907fc4ef4e90b265b93ed95522179ec1eed3d16aef7642df Mar 19 20:20:01 crc kubenswrapper[4799]: I0319 20:20:01.161303 4799 generic.go:334] "Generic (PLEG): container finished" podID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerID="ee170301d8d910f6791424f3da9e3b093594da83ca9017c12001d69a1b2ed69d" exitCode=0 Mar 19 20:20:01 crc kubenswrapper[4799]: I0319 20:20:01.161454 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" event={"ID":"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3","Type":"ContainerDied","Data":"ee170301d8d910f6791424f3da9e3b093594da83ca9017c12001d69a1b2ed69d"} Mar 19 20:20:01 crc kubenswrapper[4799]: I0319 20:20:01.161524 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" event={"ID":"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3","Type":"ContainerStarted","Data":"5677b8ec55049bb18683ef473a7c4bdc7787dd4dd4d67faa74006d067cd3836e"} Mar 19 20:20:01 crc 
kubenswrapper[4799]: I0319 20:20:01.162858 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" event={"ID":"2fed39f2-bbdc-492f-be94-cdde1c1798ed","Type":"ContainerStarted","Data":"2346ede970686ed1907fc4ef4e90b265b93ed95522179ec1eed3d16aef7642df"} Mar 19 20:20:02 crc kubenswrapper[4799]: I0319 20:20:02.169880 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" event={"ID":"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3","Type":"ContainerStarted","Data":"eace55bab65617e599371db495c3c0d2544b5a2ad90e36223cd6228426053d67"} Mar 19 20:20:03 crc kubenswrapper[4799]: I0319 20:20:03.179757 4799 generic.go:334] "Generic (PLEG): container finished" podID="2fed39f2-bbdc-492f-be94-cdde1c1798ed" containerID="ab7a7269c37b8ab72c829be828b09c90cf05f52c872b4863329f824043350000" exitCode=0 Mar 19 20:20:03 crc kubenswrapper[4799]: I0319 20:20:03.180035 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" event={"ID":"2fed39f2-bbdc-492f-be94-cdde1c1798ed","Type":"ContainerDied","Data":"ab7a7269c37b8ab72c829be828b09c90cf05f52c872b4863329f824043350000"} Mar 19 20:20:03 crc kubenswrapper[4799]: I0319 20:20:03.191122 4799 generic.go:334] "Generic (PLEG): container finished" podID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerID="eace55bab65617e599371db495c3c0d2544b5a2ad90e36223cd6228426053d67" exitCode=0 Mar 19 20:20:03 crc kubenswrapper[4799]: I0319 20:20:03.191267 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" event={"ID":"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3","Type":"ContainerDied","Data":"eace55bab65617e599371db495c3c0d2544b5a2ad90e36223cd6228426053d67"} Mar 19 20:20:04 crc kubenswrapper[4799]: I0319 20:20:04.205018 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerID="89ef39f34ebc80a8e8148a9bf33820f529b9321abd266ac4ac023d956cd10d7d" exitCode=0 Mar 19 20:20:04 crc kubenswrapper[4799]: I0319 20:20:04.205738 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" event={"ID":"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3","Type":"ContainerDied","Data":"89ef39f34ebc80a8e8148a9bf33820f529b9321abd266ac4ac023d956cd10d7d"} Mar 19 20:20:04 crc kubenswrapper[4799]: I0319 20:20:04.534521 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:04 crc kubenswrapper[4799]: I0319 20:20:04.707005 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xq6s\" (UniqueName: \"kubernetes.io/projected/2fed39f2-bbdc-492f-be94-cdde1c1798ed-kube-api-access-7xq6s\") pod \"2fed39f2-bbdc-492f-be94-cdde1c1798ed\" (UID: \"2fed39f2-bbdc-492f-be94-cdde1c1798ed\") " Mar 19 20:20:04 crc kubenswrapper[4799]: I0319 20:20:04.713338 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fed39f2-bbdc-492f-be94-cdde1c1798ed-kube-api-access-7xq6s" (OuterVolumeSpecName: "kube-api-access-7xq6s") pod "2fed39f2-bbdc-492f-be94-cdde1c1798ed" (UID: "2fed39f2-bbdc-492f-be94-cdde1c1798ed"). InnerVolumeSpecName "kube-api-access-7xq6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:20:04 crc kubenswrapper[4799]: I0319 20:20:04.809198 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xq6s\" (UniqueName: \"kubernetes.io/projected/2fed39f2-bbdc-492f-be94-cdde1c1798ed-kube-api-access-7xq6s\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.216436 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" event={"ID":"2fed39f2-bbdc-492f-be94-cdde1c1798ed","Type":"ContainerDied","Data":"2346ede970686ed1907fc4ef4e90b265b93ed95522179ec1eed3d16aef7642df"} Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.216476 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-pf9sc" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.216511 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2346ede970686ed1907fc4ef4e90b265b93ed95522179ec1eed3d16aef7642df" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.594362 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-fjk2f"] Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.597607 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-fjk2f"] Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.601644 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.722565 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-util\") pod \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.722609 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsrgb\" (UniqueName: \"kubernetes.io/projected/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-kube-api-access-lsrgb\") pod \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.722725 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-bundle\") pod \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\" (UID: \"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3\") " Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.723964 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-bundle" (OuterVolumeSpecName: "bundle") pod "21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" (UID: "21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.725548 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-kube-api-access-lsrgb" (OuterVolumeSpecName: "kube-api-access-lsrgb") pod "21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" (UID: "21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3"). InnerVolumeSpecName "kube-api-access-lsrgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.742819 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-util" (OuterVolumeSpecName: "util") pod "21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" (UID: "21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.824662 4799 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.824692 4799 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-util\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:05 crc kubenswrapper[4799]: I0319 20:20:05.824703 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsrgb\" (UniqueName: \"kubernetes.io/projected/21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3-kube-api-access-lsrgb\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:06 crc kubenswrapper[4799]: I0319 20:20:06.234377 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" event={"ID":"21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3","Type":"ContainerDied","Data":"5677b8ec55049bb18683ef473a7c4bdc7787dd4dd4d67faa74006d067cd3836e"} Mar 19 20:20:06 crc kubenswrapper[4799]: I0319 20:20:06.234482 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5677b8ec55049bb18683ef473a7c4bdc7787dd4dd4d67faa74006d067cd3836e" Mar 19 20:20:06 crc kubenswrapper[4799]: I0319 20:20:06.234520 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq" Mar 19 20:20:07 crc kubenswrapper[4799]: I0319 20:20:07.126786 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3482d963-d9c6-41f9-b382-486e75051602" path="/var/lib/kubelet/pods/3482d963-d9c6-41f9-b382-486e75051602/volumes" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.132436 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t"] Mar 19 20:20:12 crc kubenswrapper[4799]: E0319 20:20:12.132976 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="extract" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.132989 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="extract" Mar 19 20:20:12 crc kubenswrapper[4799]: E0319 20:20:12.133007 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fed39f2-bbdc-492f-be94-cdde1c1798ed" containerName="oc" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.133014 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fed39f2-bbdc-492f-be94-cdde1c1798ed" containerName="oc" Mar 19 20:20:12 crc kubenswrapper[4799]: E0319 20:20:12.133025 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="pull" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.133033 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="pull" Mar 19 20:20:12 crc kubenswrapper[4799]: E0319 20:20:12.133045 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="util" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.133052 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="util" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.133188 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3" containerName="extract" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.133209 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fed39f2-bbdc-492f-be94-cdde1c1798ed" containerName="oc" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.133720 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.135784 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-2jppl" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.175818 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prvj\" (UniqueName: \"kubernetes.io/projected/4fdcc365-1a41-47ce-8988-24a55f0bb8ac-kube-api-access-7prvj\") pod \"openstack-operator-controller-init-b85c4d696-47f2t\" (UID: \"4fdcc365-1a41-47ce-8988-24a55f0bb8ac\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.208488 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t"] Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.276802 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prvj\" (UniqueName: \"kubernetes.io/projected/4fdcc365-1a41-47ce-8988-24a55f0bb8ac-kube-api-access-7prvj\") pod \"openstack-operator-controller-init-b85c4d696-47f2t\" (UID: \"4fdcc365-1a41-47ce-8988-24a55f0bb8ac\") " 
pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.301377 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prvj\" (UniqueName: \"kubernetes.io/projected/4fdcc365-1a41-47ce-8988-24a55f0bb8ac-kube-api-access-7prvj\") pod \"openstack-operator-controller-init-b85c4d696-47f2t\" (UID: \"4fdcc365-1a41-47ce-8988-24a55f0bb8ac\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.453778 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:12 crc kubenswrapper[4799]: I0319 20:20:12.906998 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t"] Mar 19 20:20:13 crc kubenswrapper[4799]: I0319 20:20:13.292318 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" event={"ID":"4fdcc365-1a41-47ce-8988-24a55f0bb8ac","Type":"ContainerStarted","Data":"a3b1fcd923e72da555a313a49e26aac941c71809e18a8be52cd78539bd6a92ad"} Mar 19 20:20:18 crc kubenswrapper[4799]: I0319 20:20:18.332984 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" event={"ID":"4fdcc365-1a41-47ce-8988-24a55f0bb8ac","Type":"ContainerStarted","Data":"0c1af4f850928912c12323c42f257dafce5e08d34bc335aa2457add413c34507"} Mar 19 20:20:18 crc kubenswrapper[4799]: I0319 20:20:18.333668 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:22 crc kubenswrapper[4799]: I0319 20:20:22.457731 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" Mar 19 20:20:22 crc kubenswrapper[4799]: I0319 20:20:22.486717 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-47f2t" podStartSLOduration=6.12519899 podStartE2EDuration="10.486699006s" podCreationTimestamp="2026-03-19 20:20:12 +0000 UTC" firstStartedPulling="2026-03-19 20:20:12.907717507 +0000 UTC m=+890.513670609" lastFinishedPulling="2026-03-19 20:20:17.269217553 +0000 UTC m=+894.875170625" observedRunningTime="2026-03-19 20:20:18.384886446 +0000 UTC m=+895.990839558" watchObservedRunningTime="2026-03-19 20:20:22.486699006 +0000 UTC m=+900.092652078" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.153927 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cqx8d"] Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.157051 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.178742 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqx8d"] Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.283932 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7blq\" (UniqueName: \"kubernetes.io/projected/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-kube-api-access-t7blq\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.284237 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-catalog-content\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.284420 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-utilities\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.385648 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7blq\" (UniqueName: \"kubernetes.io/projected/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-kube-api-access-t7blq\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.385739 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-catalog-content\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.385817 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-utilities\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.386484 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-utilities\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.386662 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-catalog-content\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.417316 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7blq\" (UniqueName: \"kubernetes.io/projected/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-kube-api-access-t7blq\") pod \"community-operators-cqx8d\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.482604 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:27 crc kubenswrapper[4799]: I0319 20:20:27.702734 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqx8d"] Mar 19 20:20:27 crc kubenswrapper[4799]: W0319 20:20:27.709979 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod565b2224_c5f9_47a8_a9cf_f7a0aa2815a5.slice/crio-937404f3cb69b2e0b3a673aac5c057ef6b8cef565f938904660b4e2021771bf7 WatchSource:0}: Error finding container 937404f3cb69b2e0b3a673aac5c057ef6b8cef565f938904660b4e2021771bf7: Status 404 returned error can't find the container with id 937404f3cb69b2e0b3a673aac5c057ef6b8cef565f938904660b4e2021771bf7 Mar 19 20:20:28 crc kubenswrapper[4799]: I0319 20:20:28.406206 4799 generic.go:334] "Generic (PLEG): container finished" podID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerID="a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62" exitCode=0 Mar 19 20:20:28 crc kubenswrapper[4799]: I0319 20:20:28.406244 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerDied","Data":"a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62"} Mar 19 20:20:28 crc kubenswrapper[4799]: I0319 20:20:28.406520 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerStarted","Data":"937404f3cb69b2e0b3a673aac5c057ef6b8cef565f938904660b4e2021771bf7"} Mar 19 20:20:28 crc kubenswrapper[4799]: I0319 20:20:28.756219 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:20:28 crc kubenswrapper[4799]: I0319 20:20:28.756318 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:20:29 crc kubenswrapper[4799]: I0319 20:20:29.413132 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerStarted","Data":"ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362"} Mar 19 20:20:30 crc kubenswrapper[4799]: I0319 20:20:30.422107 4799 generic.go:334] "Generic (PLEG): container finished" podID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerID="ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362" exitCode=0 Mar 19 20:20:30 crc kubenswrapper[4799]: I0319 20:20:30.422878 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerDied","Data":"ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362"} Mar 19 20:20:31 crc kubenswrapper[4799]: I0319 20:20:31.428805 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerStarted","Data":"f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790"} Mar 19 20:20:31 crc kubenswrapper[4799]: I0319 20:20:31.446634 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cqx8d" podStartSLOduration=1.907141969 podStartE2EDuration="4.446619775s" podCreationTimestamp="2026-03-19 20:20:27 +0000 UTC" 
firstStartedPulling="2026-03-19 20:20:28.407662982 +0000 UTC m=+906.013616054" lastFinishedPulling="2026-03-19 20:20:30.947140788 +0000 UTC m=+908.553093860" observedRunningTime="2026-03-19 20:20:31.445031004 +0000 UTC m=+909.050984076" watchObservedRunningTime="2026-03-19 20:20:31.446619775 +0000 UTC m=+909.052572847" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.528597 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xt9sh"] Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.530129 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.542743 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xt9sh"] Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.716099 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-utilities\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.716195 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-catalog-content\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.716237 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcc7\" (UniqueName: \"kubernetes.io/projected/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-kube-api-access-gqcc7\") pod 
\"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.817106 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-utilities\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.817181 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-catalog-content\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.817232 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcc7\" (UniqueName: \"kubernetes.io/projected/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-kube-api-access-gqcc7\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.817950 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-catalog-content\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.818153 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-utilities\") pod \"certified-operators-xt9sh\" (UID: 
\"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.849011 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcc7\" (UniqueName: \"kubernetes.io/projected/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-kube-api-access-gqcc7\") pod \"certified-operators-xt9sh\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:34 crc kubenswrapper[4799]: I0319 20:20:34.903365 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:35 crc kubenswrapper[4799]: I0319 20:20:35.300905 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xt9sh"] Mar 19 20:20:35 crc kubenswrapper[4799]: W0319 20:20:35.309890 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ec3bd0_9b6e_4cd9_bca2_e1e75863a174.slice/crio-752e1585fedc2aa1d6fcdb753c7d12a57eb2ff361f14a864bb6ca3074c767c62 WatchSource:0}: Error finding container 752e1585fedc2aa1d6fcdb753c7d12a57eb2ff361f14a864bb6ca3074c767c62: Status 404 returned error can't find the container with id 752e1585fedc2aa1d6fcdb753c7d12a57eb2ff361f14a864bb6ca3074c767c62 Mar 19 20:20:35 crc kubenswrapper[4799]: I0319 20:20:35.451940 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xt9sh" event={"ID":"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174","Type":"ContainerStarted","Data":"752e1585fedc2aa1d6fcdb753c7d12a57eb2ff361f14a864bb6ca3074c767c62"} Mar 19 20:20:36 crc kubenswrapper[4799]: I0319 20:20:36.472362 4799 generic.go:334] "Generic (PLEG): container finished" podID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerID="d55deb3759a278a548e31f104fa87abe43f46bc837ffb3776e0ca75a25f3632d" exitCode=0 Mar 19 20:20:36 
crc kubenswrapper[4799]: I0319 20:20:36.472425 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xt9sh" event={"ID":"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174","Type":"ContainerDied","Data":"d55deb3759a278a548e31f104fa87abe43f46bc837ffb3776e0ca75a25f3632d"} Mar 19 20:20:37 crc kubenswrapper[4799]: I0319 20:20:37.482998 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:37 crc kubenswrapper[4799]: I0319 20:20:37.483278 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:37 crc kubenswrapper[4799]: I0319 20:20:37.541883 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:38 crc kubenswrapper[4799]: I0319 20:20:38.032653 4799 scope.go:117] "RemoveContainer" containerID="027b544f1ac7ca66542ecf1d6f504a7e07577eb5791872fd86c10c27bc623cd1" Mar 19 20:20:38 crc kubenswrapper[4799]: I0319 20:20:38.483569 4799 generic.go:334] "Generic (PLEG): container finished" podID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerID="45b2042c3316b5e0aa64125dc1ed0908a44005ceacfde2031efa979ea9f59910" exitCode=0 Mar 19 20:20:38 crc kubenswrapper[4799]: I0319 20:20:38.483610 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xt9sh" event={"ID":"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174","Type":"ContainerDied","Data":"45b2042c3316b5e0aa64125dc1ed0908a44005ceacfde2031efa979ea9f59910"} Mar 19 20:20:38 crc kubenswrapper[4799]: I0319 20:20:38.525545 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:39 crc kubenswrapper[4799]: I0319 20:20:39.491694 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xt9sh" 
event={"ID":"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174","Type":"ContainerStarted","Data":"e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e"} Mar 19 20:20:39 crc kubenswrapper[4799]: I0319 20:20:39.515877 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xt9sh" podStartSLOduration=3.117124304 podStartE2EDuration="5.51586269s" podCreationTimestamp="2026-03-19 20:20:34 +0000 UTC" firstStartedPulling="2026-03-19 20:20:36.474126715 +0000 UTC m=+914.080079787" lastFinishedPulling="2026-03-19 20:20:38.872865101 +0000 UTC m=+916.478818173" observedRunningTime="2026-03-19 20:20:39.512710778 +0000 UTC m=+917.118663840" watchObservedRunningTime="2026-03-19 20:20:39.51586269 +0000 UTC m=+917.121815762" Mar 19 20:20:40 crc kubenswrapper[4799]: I0319 20:20:40.916659 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqx8d"] Mar 19 20:20:41 crc kubenswrapper[4799]: I0319 20:20:41.505271 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cqx8d" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="registry-server" containerID="cri-o://f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790" gracePeriod=2 Mar 19 20:20:41 crc kubenswrapper[4799]: I0319 20:20:41.882588 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.006896 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-catalog-content\") pod \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.007249 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-utilities\") pod \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.007293 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7blq\" (UniqueName: \"kubernetes.io/projected/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-kube-api-access-t7blq\") pod \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\" (UID: \"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5\") " Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.008163 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-utilities" (OuterVolumeSpecName: "utilities") pod "565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" (UID: "565b2224-c5f9-47a8-a9cf-f7a0aa2815a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.012457 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-kube-api-access-t7blq" (OuterVolumeSpecName: "kube-api-access-t7blq") pod "565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" (UID: "565b2224-c5f9-47a8-a9cf-f7a0aa2815a5"). InnerVolumeSpecName "kube-api-access-t7blq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.072028 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" (UID: "565b2224-c5f9-47a8-a9cf-f7a0aa2815a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.108927 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7blq\" (UniqueName: \"kubernetes.io/projected/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-kube-api-access-t7blq\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.108966 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.108977 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.513805 4799 generic.go:334] "Generic (PLEG): container finished" podID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerID="f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790" exitCode=0 Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.513847 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerDied","Data":"f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790"} Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.513876 4799 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqx8d" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.513893 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqx8d" event={"ID":"565b2224-c5f9-47a8-a9cf-f7a0aa2815a5","Type":"ContainerDied","Data":"937404f3cb69b2e0b3a673aac5c057ef6b8cef565f938904660b4e2021771bf7"} Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.513914 4799 scope.go:117] "RemoveContainer" containerID="f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.532498 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf"] Mar 19 20:20:42 crc kubenswrapper[4799]: E0319 20:20:42.532761 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="registry-server" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.532781 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="registry-server" Mar 19 20:20:42 crc kubenswrapper[4799]: E0319 20:20:42.532795 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="extract-content" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.532804 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="extract-content" Mar 19 20:20:42 crc kubenswrapper[4799]: E0319 20:20:42.532827 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="extract-utilities" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.532836 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="extract-utilities" Mar 19 20:20:42 crc 
kubenswrapper[4799]: I0319 20:20:42.532964 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" containerName="registry-server" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.533446 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.534517 4799 scope.go:117] "RemoveContainer" containerID="ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.535368 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wjnhx" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.550771 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.556521 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.557420 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.561551 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-j54vq" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.569562 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.579440 4799 scope.go:117] "RemoveContainer" containerID="a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.582979 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.583792 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.585130 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-wmnw5" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.605327 4799 scope.go:117] "RemoveContainer" containerID="f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790" Mar 19 20:20:42 crc kubenswrapper[4799]: E0319 20:20:42.605770 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790\": container with ID starting with f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790 not found: ID does not exist" containerID="f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 
20:20:42.605815 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790"} err="failed to get container status \"f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790\": rpc error: code = NotFound desc = could not find container \"f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790\": container with ID starting with f2b24b883b579b3de34f10ee3839fbed31d741fdc3b61a527877b437c1551790 not found: ID does not exist" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.605843 4799 scope.go:117] "RemoveContainer" containerID="ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362" Mar 19 20:20:42 crc kubenswrapper[4799]: E0319 20:20:42.606460 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362\": container with ID starting with ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362 not found: ID does not exist" containerID="ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.606484 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362"} err="failed to get container status \"ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362\": rpc error: code = NotFound desc = could not find container \"ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362\": container with ID starting with ce38d2b6f6af0f17c91ff91d1b191d67f31b3b6424c2386f7e547f4e273fe362 not found: ID does not exist" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.606501 4799 scope.go:117] "RemoveContainer" containerID="a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62" Mar 19 20:20:42 crc 
kubenswrapper[4799]: E0319 20:20:42.606817 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62\": container with ID starting with a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62 not found: ID does not exist" containerID="a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.606870 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62"} err="failed to get container status \"a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62\": rpc error: code = NotFound desc = could not find container \"a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62\": container with ID starting with a58f91cd3167b526308b0a3eca9ad48c394a2d639f3e19a024aebd5cb6510c62 not found: ID does not exist" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.614281 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjh5\" (UniqueName: \"kubernetes.io/projected/f5a6c547-0da9-4313-817a-9562fa9cb775-kube-api-access-gvjh5\") pod \"barbican-operator-controller-manager-59bc569d95-jphcf\" (UID: \"f5a6c547-0da9-4313-817a-9562fa9cb775\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.615758 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.616492 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.625920 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-x6vjp" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.638182 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.680237 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqx8d"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.689093 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cqx8d"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.694092 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.705374 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.706251 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.706670 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.706791 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.710846 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-sb9qp" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.710857 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-lgk8l" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.715313 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sbzh\" (UniqueName: \"kubernetes.io/projected/3e5cf32e-9b90-4518-86bb-5237dbf97e55-kube-api-access-7sbzh\") pod \"glance-operator-controller-manager-79df6bcc97-d5sq6\" (UID: \"3e5cf32e-9b90-4518-86bb-5237dbf97e55\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.715351 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sz56\" (UniqueName: \"kubernetes.io/projected/fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce-kube-api-access-4sz56\") pod \"cinder-operator-controller-manager-8d58dc466-xrrtz\" (UID: \"fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.715459 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjh5\" (UniqueName: \"kubernetes.io/projected/f5a6c547-0da9-4313-817a-9562fa9cb775-kube-api-access-gvjh5\") pod \"barbican-operator-controller-manager-59bc569d95-jphcf\" (UID: \"f5a6c547-0da9-4313-817a-9562fa9cb775\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.715484 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648pw\" (UniqueName: \"kubernetes.io/projected/8f841081-d1d8-464a-ae77-af76f0a109ea-kube-api-access-648pw\") pod \"designate-operator-controller-manager-588d4d986b-6j7gd\" (UID: \"8f841081-d1d8-464a-ae77-af76f0a109ea\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.727444 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.739173 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.740053 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.745165 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-h4w6l" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.745327 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.746085 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.750713 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-lch4x" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.769242 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjh5\" (UniqueName: \"kubernetes.io/projected/f5a6c547-0da9-4313-817a-9562fa9cb775-kube-api-access-gvjh5\") pod \"barbican-operator-controller-manager-59bc569d95-jphcf\" (UID: \"f5a6c547-0da9-4313-817a-9562fa9cb775\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.791639 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.792444 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.803700 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.804204 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cf8x9" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.806636 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.812348 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.818560 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.819207 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-648pw\" (UniqueName: \"kubernetes.io/projected/8f841081-d1d8-464a-ae77-af76f0a109ea-kube-api-access-648pw\") pod \"designate-operator-controller-manager-588d4d986b-6j7gd\" (UID: \"8f841081-d1d8-464a-ae77-af76f0a109ea\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.819276 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sbzh\" (UniqueName: \"kubernetes.io/projected/3e5cf32e-9b90-4518-86bb-5237dbf97e55-kube-api-access-7sbzh\") pod \"glance-operator-controller-manager-79df6bcc97-d5sq6\" (UID: \"3e5cf32e-9b90-4518-86bb-5237dbf97e55\") " 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.819323 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sz56\" (UniqueName: \"kubernetes.io/projected/fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce-kube-api-access-4sz56\") pod \"cinder-operator-controller-manager-8d58dc466-xrrtz\" (UID: \"fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.819355 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7rcc\" (UniqueName: \"kubernetes.io/projected/1984ed7f-dd4a-43b7-b724-a902bccf7448-kube-api-access-h7rcc\") pod \"horizon-operator-controller-manager-8464cc45fb-kswht\" (UID: \"1984ed7f-dd4a-43b7-b724-a902bccf7448\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.819437 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kz9w\" (UniqueName: \"kubernetes.io/projected/1e9b69cc-5dc0-400e-9894-7ff0b173e6cb-kube-api-access-2kz9w\") pod \"heat-operator-controller-manager-67dd5f86f5-f7g26\" (UID: \"1e9b69cc-5dc0-400e-9894-7ff0b173e6cb\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.823347 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.838932 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-rhbks"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.841250 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.845799 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-d8fn4" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.845987 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-rhbks"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.848128 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.848871 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.855128 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lsgwx" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.870354 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sbzh\" (UniqueName: \"kubernetes.io/projected/3e5cf32e-9b90-4518-86bb-5237dbf97e55-kube-api-access-7sbzh\") pod \"glance-operator-controller-manager-79df6bcc97-d5sq6\" (UID: \"3e5cf32e-9b90-4518-86bb-5237dbf97e55\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.874426 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sz56\" (UniqueName: \"kubernetes.io/projected/fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce-kube-api-access-4sz56\") pod \"cinder-operator-controller-manager-8d58dc466-xrrtz\" (UID: \"fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce\") " 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.874536 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.878397 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.880203 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-648pw\" (UniqueName: \"kubernetes.io/projected/8f841081-d1d8-464a-ae77-af76f0a109ea-kube-api-access-648pw\") pod \"designate-operator-controller-manager-588d4d986b-6j7gd\" (UID: \"8f841081-d1d8-464a-ae77-af76f0a109ea\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.880423 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.889693 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qvd9k" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.893891 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.906902 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.920424 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.920461 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzc9\" (UniqueName: \"kubernetes.io/projected/310bb10d-00e4-4135-826e-43f7ca17bdf1-kube-api-access-5rzc9\") pod \"manila-operator-controller-manager-55f864c847-rhbks\" (UID: \"310bb10d-00e4-4135-826e-43f7ca17bdf1\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.920497 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7rcc\" (UniqueName: \"kubernetes.io/projected/1984ed7f-dd4a-43b7-b724-a902bccf7448-kube-api-access-h7rcc\") pod \"horizon-operator-controller-manager-8464cc45fb-kswht\" (UID: \"1984ed7f-dd4a-43b7-b724-a902bccf7448\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.920535 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786zj\" (UniqueName: \"kubernetes.io/projected/23a48119-c751-435f-9ed5-5a4b0dcf7ae0-kube-api-access-786zj\") pod \"ironic-operator-controller-manager-6f787dddc9-rrp47\" (UID: \"23a48119-c751-435f-9ed5-5a4b0dcf7ae0\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:20:42 crc kubenswrapper[4799]: 
I0319 20:20:42.920556 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpc2\" (UniqueName: \"kubernetes.io/projected/771a024f-6f6e-43f1-82cb-076a70663c36-kube-api-access-xqpc2\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.920575 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qh8\" (UniqueName: \"kubernetes.io/projected/52430659-ee8f-4143-a0cc-554487c4ee41-kube-api-access-w5qh8\") pod \"keystone-operator-controller-manager-768b96df4c-ct2xj\" (UID: \"52430659-ee8f-4143-a0cc-554487c4ee41\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.920598 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kz9w\" (UniqueName: \"kubernetes.io/projected/1e9b69cc-5dc0-400e-9894-7ff0b173e6cb-kube-api-access-2kz9w\") pod \"heat-operator-controller-manager-67dd5f86f5-f7g26\" (UID: \"1e9b69cc-5dc0-400e-9894-7ff0b173e6cb\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.932738 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.933604 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.936839 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wbs7z" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.938979 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.949408 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kz9w\" (UniqueName: \"kubernetes.io/projected/1e9b69cc-5dc0-400e-9894-7ff0b173e6cb-kube-api-access-2kz9w\") pod \"heat-operator-controller-manager-67dd5f86f5-f7g26\" (UID: \"1e9b69cc-5dc0-400e-9894-7ff0b173e6cb\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.969316 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7rcc\" (UniqueName: \"kubernetes.io/projected/1984ed7f-dd4a-43b7-b724-a902bccf7448-kube-api-access-h7rcc\") pod \"horizon-operator-controller-manager-8464cc45fb-kswht\" (UID: \"1984ed7f-dd4a-43b7-b724-a902bccf7448\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.977616 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz"] Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.978810 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.982534 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qs9tt" Mar 19 20:20:42 crc kubenswrapper[4799]: I0319 20:20:42.990084 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.011541 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021224 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-786zj\" (UniqueName: \"kubernetes.io/projected/23a48119-c751-435f-9ed5-5a4b0dcf7ae0-kube-api-access-786zj\") pod \"ironic-operator-controller-manager-6f787dddc9-rrp47\" (UID: \"23a48119-c751-435f-9ed5-5a4b0dcf7ae0\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021296 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpc2\" (UniqueName: \"kubernetes.io/projected/771a024f-6f6e-43f1-82cb-076a70663c36-kube-api-access-xqpc2\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021322 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qh8\" (UniqueName: \"kubernetes.io/projected/52430659-ee8f-4143-a0cc-554487c4ee41-kube-api-access-w5qh8\") pod \"keystone-operator-controller-manager-768b96df4c-ct2xj\" (UID: 
\"52430659-ee8f-4143-a0cc-554487c4ee41\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021755 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wps6d\" (UniqueName: \"kubernetes.io/projected/5dffef78-a7d7-400c-8e1c-80fd01df4f07-kube-api-access-wps6d\") pod \"neutron-operator-controller-manager-767865f676-jqm8n\" (UID: \"5dffef78-a7d7-400c-8e1c-80fd01df4f07\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021823 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdqv8\" (UniqueName: \"kubernetes.io/projected/72be7e05-f329-4420-9632-3f6827c4e0e9-kube-api-access-xdqv8\") pod \"octavia-operator-controller-manager-5b9f45d989-6wvnr\" (UID: \"72be7e05-f329-4420-9632-3f6827c4e0e9\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021878 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.021901 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzc9\" (UniqueName: \"kubernetes.io/projected/310bb10d-00e4-4135-826e-43f7ca17bdf1-kube-api-access-5rzc9\") pod \"manila-operator-controller-manager-55f864c847-rhbks\" (UID: \"310bb10d-00e4-4135-826e-43f7ca17bdf1\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:20:43 crc 
kubenswrapper[4799]: I0319 20:20:43.021953 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8g4x\" (UniqueName: \"kubernetes.io/projected/a0891d87-bed5-4b7b-bab9-653866be0678-kube-api-access-p8g4x\") pod \"mariadb-operator-controller-manager-67ccfc9778-j6bcm\" (UID: \"a0891d87-bed5-4b7b-bab9-653866be0678\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.022039 4799 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.022087 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert podName:771a024f-6f6e-43f1-82cb-076a70663c36 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:43.522071636 +0000 UTC m=+921.128024708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert") pod "infra-operator-controller-manager-7b9c774f96-cnhvk" (UID: "771a024f-6f6e-43f1-82cb-076a70663c36") : secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.027254 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.040803 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qh8\" (UniqueName: \"kubernetes.io/projected/52430659-ee8f-4143-a0cc-554487c4ee41-kube-api-access-w5qh8\") pod \"keystone-operator-controller-manager-768b96df4c-ct2xj\" (UID: \"52430659-ee8f-4143-a0cc-554487c4ee41\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.049875 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.050173 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.059532 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786zj\" (UniqueName: \"kubernetes.io/projected/23a48119-c751-435f-9ed5-5a4b0dcf7ae0-kube-api-access-786zj\") pod \"ironic-operator-controller-manager-6f787dddc9-rrp47\" (UID: \"23a48119-c751-435f-9ed5-5a4b0dcf7ae0\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.064517 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpc2\" (UniqueName: \"kubernetes.io/projected/771a024f-6f6e-43f1-82cb-076a70663c36-kube-api-access-xqpc2\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.072890 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.073293 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzc9\" (UniqueName: \"kubernetes.io/projected/310bb10d-00e4-4135-826e-43f7ca17bdf1-kube-api-access-5rzc9\") pod \"manila-operator-controller-manager-55f864c847-rhbks\" (UID: \"310bb10d-00e4-4135-826e-43f7ca17bdf1\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.082349 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.094061 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.096098 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.102954 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ntk8q" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.105299 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.106068 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.106700 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.118103 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-krc2q" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.118347 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.171526 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlv5\" (UniqueName: \"kubernetes.io/projected/b1950b27-4b38-4bd5-b858-fcf5aa82d7fd-kube-api-access-sdlv5\") pod \"nova-operator-controller-manager-5d488d59fb-5s2dz\" (UID: \"b1950b27-4b38-4bd5-b858-fcf5aa82d7fd\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.172184 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8g4x\" (UniqueName: \"kubernetes.io/projected/a0891d87-bed5-4b7b-bab9-653866be0678-kube-api-access-p8g4x\") pod \"mariadb-operator-controller-manager-67ccfc9778-j6bcm\" (UID: \"a0891d87-bed5-4b7b-bab9-653866be0678\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.172280 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.172448 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v76r2\" (UniqueName: \"kubernetes.io/projected/8fe20f94-3898-4826-8f94-a97f5d7619d6-kube-api-access-v76r2\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.172533 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wps6d\" (UniqueName: \"kubernetes.io/projected/5dffef78-a7d7-400c-8e1c-80fd01df4f07-kube-api-access-wps6d\") pod \"neutron-operator-controller-manager-767865f676-jqm8n\" (UID: \"5dffef78-a7d7-400c-8e1c-80fd01df4f07\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.172602 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdqv8\" (UniqueName: \"kubernetes.io/projected/72be7e05-f329-4420-9632-3f6827c4e0e9-kube-api-access-xdqv8\") pod \"octavia-operator-controller-manager-5b9f45d989-6wvnr\" (UID: \"72be7e05-f329-4420-9632-3f6827c4e0e9\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.182515 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565b2224-c5f9-47a8-a9cf-f7a0aa2815a5" path="/var/lib/kubelet/pods/565b2224-c5f9-47a8-a9cf-f7a0aa2815a5/volumes" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.183174 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.213906 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx"] Mar 
19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.214743 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.216633 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8g4x\" (UniqueName: \"kubernetes.io/projected/a0891d87-bed5-4b7b-bab9-653866be0678-kube-api-access-p8g4x\") pod \"mariadb-operator-controller-manager-67ccfc9778-j6bcm\" (UID: \"a0891d87-bed5-4b7b-bab9-653866be0678\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.219729 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.221113 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-kssvm" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.221208 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdqv8\" (UniqueName: \"kubernetes.io/projected/72be7e05-f329-4420-9632-3f6827c4e0e9-kube-api-access-xdqv8\") pod \"octavia-operator-controller-manager-5b9f45d989-6wvnr\" (UID: \"72be7e05-f329-4420-9632-3f6827c4e0e9\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.226156 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.229296 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.235037 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wps6d\" (UniqueName: \"kubernetes.io/projected/5dffef78-a7d7-400c-8e1c-80fd01df4f07-kube-api-access-wps6d\") pod \"neutron-operator-controller-manager-767865f676-jqm8n\" (UID: \"5dffef78-a7d7-400c-8e1c-80fd01df4f07\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.251201 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.270891 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.275298 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.276084 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln2vq\" (UniqueName: \"kubernetes.io/projected/879a3d02-050b-44dc-95ff-f4a010fe4739-kube-api-access-ln2vq\") pod \"ovn-operator-controller-manager-884679f54-gjj5c\" (UID: \"879a3d02-050b-44dc-95ff-f4a010fe4739\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.276173 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlv5\" (UniqueName: \"kubernetes.io/projected/b1950b27-4b38-4bd5-b858-fcf5aa82d7fd-kube-api-access-sdlv5\") pod \"nova-operator-controller-manager-5d488d59fb-5s2dz\" (UID: 
\"b1950b27-4b38-4bd5-b858-fcf5aa82d7fd\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.276210 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.276260 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v76r2\" (UniqueName: \"kubernetes.io/projected/8fe20f94-3898-4826-8f94-a97f5d7619d6-kube-api-access-v76r2\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.277085 4799 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.277155 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert podName:8fe20f94-3898-4826-8f94-a97f5d7619d6 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:43.777136863 +0000 UTC m=+921.383089935 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899hpqp2" (UID: "8fe20f94-3898-4826-8f94-a97f5d7619d6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.277197 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.289222 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qscdk" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.301288 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v76r2\" (UniqueName: \"kubernetes.io/projected/8fe20f94-3898-4826-8f94-a97f5d7619d6-kube-api-access-v76r2\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.302604 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.305858 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.309666 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlv5\" (UniqueName: \"kubernetes.io/projected/b1950b27-4b38-4bd5-b858-fcf5aa82d7fd-kube-api-access-sdlv5\") pod \"nova-operator-controller-manager-5d488d59fb-5s2dz\" (UID: \"b1950b27-4b38-4bd5-b858-fcf5aa82d7fd\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.337813 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.338259 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.339157 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.342966 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.350266 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jptlv" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.365025 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.365880 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.368151 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-58ksg" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.377720 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln2vq\" (UniqueName: \"kubernetes.io/projected/879a3d02-050b-44dc-95ff-f4a010fe4739-kube-api-access-ln2vq\") pod \"ovn-operator-controller-manager-884679f54-gjj5c\" (UID: \"879a3d02-050b-44dc-95ff-f4a010fe4739\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.377757 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2pp\" (UniqueName: \"kubernetes.io/projected/9ac71989-25c8-4255-8612-a9f736ab50a1-kube-api-access-gk2pp\") pod \"placement-operator-controller-manager-5784578c99-4z6tx\" (UID: \"9ac71989-25c8-4255-8612-a9f736ab50a1\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.412008 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln2vq\" (UniqueName: \"kubernetes.io/projected/879a3d02-050b-44dc-95ff-f4a010fe4739-kube-api-access-ln2vq\") pod \"ovn-operator-controller-manager-884679f54-gjj5c\" (UID: \"879a3d02-050b-44dc-95ff-f4a010fe4739\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.417139 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.464328 4799 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.465153 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.469544 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9zw86" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.474574 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.481234 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.484219 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.485055 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488133 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ff76\" (UniqueName: \"kubernetes.io/projected/85dcf72e-1669-4316-afe2-2dbf9059cd35-kube-api-access-6ff76\") pod \"swift-operator-controller-manager-c674c5965-jrtt2\" (UID: \"85dcf72e-1669-4316-afe2-2dbf9059cd35\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488194 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6p5\" (UniqueName: \"kubernetes.io/projected/3cac2ce5-3a90-4588-88ba-11557915c62c-kube-api-access-tw6p5\") pod \"test-operator-controller-manager-5c5cb9c4d7-4v4kl\" (UID: \"3cac2ce5-3a90-4588-88ba-11557915c62c\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488263 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2pp\" (UniqueName: \"kubernetes.io/projected/9ac71989-25c8-4255-8612-a9f736ab50a1-kube-api-access-gk2pp\") pod \"placement-operator-controller-manager-5784578c99-4z6tx\" (UID: \"9ac71989-25c8-4255-8612-a9f736ab50a1\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488288 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf9qt\" (UniqueName: \"kubernetes.io/projected/503586c4-3015-43d9-bb6e-56bef997c641-kube-api-access-hf9qt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-vp4rw\" (UID: \"503586c4-3015-43d9-bb6e-56bef997c641\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" 
Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488306 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488325 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.488367 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l82sj\" (UniqueName: \"kubernetes.io/projected/d0525132-b508-40b7-a9eb-4773cfde1c32-kube-api-access-l82sj\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.489271 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzkg\" (UniqueName: \"kubernetes.io/projected/84e19238-e730-4bcf-9e09-1c6f3421a04d-kube-api-access-qxzkg\") pod \"telemetry-operator-controller-manager-d6b694c5-dhfqp\" (UID: \"84e19238-e730-4bcf-9e09-1c6f3421a04d\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.492558 4799 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jwnxj" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.492701 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.492799 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.493046 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.506950 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.507809 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2pp\" (UniqueName: \"kubernetes.io/projected/9ac71989-25c8-4255-8612-a9f736ab50a1-kube-api-access-gk2pp\") pod \"placement-operator-controller-manager-5784578c99-4z6tx\" (UID: \"9ac71989-25c8-4255-8612-a9f736ab50a1\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.511054 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.516112 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-pvkjm" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.528249 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.567566 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592161 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prl8\" (UniqueName: \"kubernetes.io/projected/f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b-kube-api-access-6prl8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2scch\" (UID: \"f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592233 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw6p5\" (UniqueName: \"kubernetes.io/projected/3cac2ce5-3a90-4588-88ba-11557915c62c-kube-api-access-tw6p5\") pod \"test-operator-controller-manager-5c5cb9c4d7-4v4kl\" (UID: \"3cac2ce5-3a90-4588-88ba-11557915c62c\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592293 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf9qt\" (UniqueName: \"kubernetes.io/projected/503586c4-3015-43d9-bb6e-56bef997c641-kube-api-access-hf9qt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-vp4rw\" 
(UID: \"503586c4-3015-43d9-bb6e-56bef997c641\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592317 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592349 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592409 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592445 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l82sj\" (UniqueName: \"kubernetes.io/projected/d0525132-b508-40b7-a9eb-4773cfde1c32-kube-api-access-l82sj\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592488 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qxzkg\" (UniqueName: \"kubernetes.io/projected/84e19238-e730-4bcf-9e09-1c6f3421a04d-kube-api-access-qxzkg\") pod \"telemetry-operator-controller-manager-d6b694c5-dhfqp\" (UID: \"84e19238-e730-4bcf-9e09-1c6f3421a04d\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.592520 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ff76\" (UniqueName: \"kubernetes.io/projected/85dcf72e-1669-4316-afe2-2dbf9059cd35-kube-api-access-6ff76\") pod \"swift-operator-controller-manager-c674c5965-jrtt2\" (UID: \"85dcf72e-1669-4316-afe2-2dbf9059cd35\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.592525 4799 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.592542 4799 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.592576 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:44.092560991 +0000 UTC m=+921.698514063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.592602 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:44.092583591 +0000 UTC m=+921.698536663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "metrics-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.592676 4799 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.592704 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert podName:771a024f-6f6e-43f1-82cb-076a70663c36 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:44.592697354 +0000 UTC m=+922.198650426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert") pod "infra-operator-controller-manager-7b9c774f96-cnhvk" (UID: "771a024f-6f6e-43f1-82cb-076a70663c36") : secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.609707 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6p5\" (UniqueName: \"kubernetes.io/projected/3cac2ce5-3a90-4588-88ba-11557915c62c-kube-api-access-tw6p5\") pod \"test-operator-controller-manager-5c5cb9c4d7-4v4kl\" (UID: \"3cac2ce5-3a90-4588-88ba-11557915c62c\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.611755 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf9qt\" (UniqueName: \"kubernetes.io/projected/503586c4-3015-43d9-bb6e-56bef997c641-kube-api-access-hf9qt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-vp4rw\" (UID: \"503586c4-3015-43d9-bb6e-56bef997c641\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.613825 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ff76\" (UniqueName: \"kubernetes.io/projected/85dcf72e-1669-4316-afe2-2dbf9059cd35-kube-api-access-6ff76\") pod \"swift-operator-controller-manager-c674c5965-jrtt2\" (UID: \"85dcf72e-1669-4316-afe2-2dbf9059cd35\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.615035 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l82sj\" (UniqueName: \"kubernetes.io/projected/d0525132-b508-40b7-a9eb-4773cfde1c32-kube-api-access-l82sj\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: 
\"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.615635 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzkg\" (UniqueName: \"kubernetes.io/projected/84e19238-e730-4bcf-9e09-1c6f3421a04d-kube-api-access-qxzkg\") pod \"telemetry-operator-controller-manager-d6b694c5-dhfqp\" (UID: \"84e19238-e730-4bcf-9e09-1c6f3421a04d\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.656015 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.664071 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd"] Mar 19 20:20:43 crc kubenswrapper[4799]: W0319 20:20:43.676813 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a6c547_0da9_4313_817a_9562fa9cb775.slice/crio-b6368a716f306eaf78eb5b9ac61ac1cc05d3882ad47f74d7def550e393481143 WatchSource:0}: Error finding container b6368a716f306eaf78eb5b9ac61ac1cc05d3882ad47f74d7def550e393481143: Status 404 returned error can't find the container with id b6368a716f306eaf78eb5b9ac61ac1cc05d3882ad47f74d7def550e393481143 Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.681351 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.702324 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prl8\" (UniqueName: \"kubernetes.io/projected/f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b-kube-api-access-6prl8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2scch\" (UID: \"f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.713073 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.718349 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prl8\" (UniqueName: \"kubernetes.io/projected/f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b-kube-api-access-6prl8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2scch\" (UID: \"f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.781730 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.804750 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.804946 4799 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: E0319 20:20:43.805036 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert podName:8fe20f94-3898-4826-8f94-a97f5d7619d6 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:44.805016715 +0000 UTC m=+922.410969787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899hpqp2" (UID: "8fe20f94-3898-4826-8f94-a97f5d7619d6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.819162 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz"] Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.850316 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" Mar 19 20:20:43 crc kubenswrapper[4799]: I0319 20:20:43.909467 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.108836 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.108934 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.109207 4799 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.109281 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:45.109255403 +0000 UTC m=+922.715208515 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "webhook-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.109845 4799 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.109896 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:45.109880579 +0000 UTC m=+922.715833681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "metrics-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.278525 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26"] Mar 19 20:20:44 crc kubenswrapper[4799]: W0319 20:20:44.297546 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52430659_ee8f_4143_a0cc_554487c4ee41.slice/crio-79f7666c12879b191be5fe1a0794e7cfea2d4ce99f5e164d212c37fce5f2caea WatchSource:0}: Error finding container 79f7666c12879b191be5fe1a0794e7cfea2d4ce99f5e164d212c37fce5f2caea: Status 404 returned error can't find the container with id 79f7666c12879b191be5fe1a0794e7cfea2d4ce99f5e164d212c37fce5f2caea Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.302867 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj"] Mar 19 20:20:44 crc kubenswrapper[4799]: W0319 20:20:44.323974 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310bb10d_00e4_4135_826e_43f7ca17bdf1.slice/crio-26a94b3fd272f73a9533c2048381cc7bd531097906c7efe82e8e6e1472c090c9 WatchSource:0}: Error finding container 26a94b3fd272f73a9533c2048381cc7bd531097906c7efe82e8e6e1472c090c9: Status 404 returned error can't find the container with id 26a94b3fd272f73a9533c2048381cc7bd531097906c7efe82e8e6e1472c090c9 Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.332924 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-rhbks"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.349089 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.359905 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6"] Mar 19 20:20:44 crc kubenswrapper[4799]: W0319 20:20:44.366626 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879a3d02_050b_44dc_95ff_f4a010fe4739.slice/crio-75423bebede04e7639a69428a43e78c7ad2fd66416ba7b666459e770048257fd WatchSource:0}: Error finding container 75423bebede04e7639a69428a43e78c7ad2fd66416ba7b666459e770048257fd: Status 404 returned error can't find the container with id 75423bebede04e7639a69428a43e78c7ad2fd66416ba7b666459e770048257fd Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.366664 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 
20:20:44.371085 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.375486 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.379647 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl"] Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.382909 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p8g4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-j6bcm_openstack-operators(a0891d87-bed5-4b7b-bab9-653866be0678): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.383505 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch"] Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.384460 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" podUID="a0891d87-bed5-4b7b-bab9-653866be0678" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.385788 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h7rcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-kswht_openstack-operators(1984ed7f-dd4a-43b7-b724-a902bccf7448): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.386898 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" podUID="1984ed7f-dd4a-43b7-b724-a902bccf7448" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.390018 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz"] Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.393586 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gk2pp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-4z6tx_openstack-operators(9ac71989-25c8-4255-8612-a9f736ab50a1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.394472 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sdlv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-5s2dz_openstack-operators(b1950b27-4b38-4bd5-b858-fcf5aa82d7fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.394640 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw"] Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.394715 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" podUID="9ac71989-25c8-4255-8612-a9f736ab50a1" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.396147 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" podUID="b1950b27-4b38-4bd5-b858-fcf5aa82d7fd" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.397142 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qxzkg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-dhfqp_openstack-operators(84e19238-e730-4bcf-9e09-1c6f3421a04d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.397240 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hf9qt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-vp4rw_openstack-operators(503586c4-3015-43d9-bb6e-56bef997c641): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.398218 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" podUID="84e19238-e730-4bcf-9e09-1c6f3421a04d" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.399519 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" podUID="503586c4-3015-43d9-bb6e-56bef997c641" Mar 19 20:20:44 crc kubenswrapper[4799]: W0319 20:20:44.401734 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85dcf72e_1669_4316_afe2_2dbf9059cd35.slice/crio-12b38a698ee2a3b993fcafdae3974613da01a9d365cf13c99dcba24fbe881e94 WatchSource:0}: Error finding container 
12b38a698ee2a3b993fcafdae3974613da01a9d365cf13c99dcba24fbe881e94: Status 404 returned error can't find the container with id 12b38a698ee2a3b993fcafdae3974613da01a9d365cf13c99dcba24fbe881e94 Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.404201 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c"] Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.404932 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6ff76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-jrtt2_openstack-operators(85dcf72e-1669-4316-afe2-2dbf9059cd35): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.406089 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" podUID="85dcf72e-1669-4316-afe2-2dbf9059cd35" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.408947 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.414324 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.418879 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.424607 4799 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2"] Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.537763 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" event={"ID":"3cac2ce5-3a90-4588-88ba-11557915c62c","Type":"ContainerStarted","Data":"d151bf4620c0d3be94acf95d18606e7e5232076bd597c53827383ec9c1bc12f0"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.539351 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" event={"ID":"8f841081-d1d8-464a-ae77-af76f0a109ea","Type":"ContainerStarted","Data":"b60738b7fc5487a91b90068db61c4755ec6c68e94d6e18bc253d018cb97b756f"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.540899 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" event={"ID":"72be7e05-f329-4420-9632-3f6827c4e0e9","Type":"ContainerStarted","Data":"73bc3cf00a43149f2e27b30de40d6a83b60a6226bfbb5159f3a04425d462d75f"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.542265 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" event={"ID":"1984ed7f-dd4a-43b7-b724-a902bccf7448","Type":"ContainerStarted","Data":"d7ec3ac1e2b79de86f9b26e35d4fe40377df9c18368e065bd0b68f9c495dfdd5"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.543749 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" podUID="1984ed7f-dd4a-43b7-b724-a902bccf7448" Mar 19 20:20:44 crc 
kubenswrapper[4799]: I0319 20:20:44.544937 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" event={"ID":"b1950b27-4b38-4bd5-b858-fcf5aa82d7fd","Type":"ContainerStarted","Data":"d172088f0dfe8ae7571f09834fffed84ae2f16ebdeca106cd6cfbb177534f576"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.546512 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" podUID="b1950b27-4b38-4bd5-b858-fcf5aa82d7fd" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.546848 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" event={"ID":"f5a6c547-0da9-4313-817a-9562fa9cb775","Type":"ContainerStarted","Data":"b6368a716f306eaf78eb5b9ac61ac1cc05d3882ad47f74d7def550e393481143"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.549280 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" event={"ID":"503586c4-3015-43d9-bb6e-56bef997c641","Type":"ContainerStarted","Data":"033e9164964887b0fcb1991511e9690fe8562f11b3f0e5241032d931138724c6"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.552258 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" podUID="503586c4-3015-43d9-bb6e-56bef997c641" Mar 19 20:20:44 crc 
kubenswrapper[4799]: I0319 20:20:44.552406 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" event={"ID":"1e9b69cc-5dc0-400e-9894-7ff0b173e6cb","Type":"ContainerStarted","Data":"f4935bb1b4d3b9a748847d5c30aa593b1dbcb033f13cb00315b7ec75bb762055"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.559482 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" event={"ID":"f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b","Type":"ContainerStarted","Data":"b704ae70c34a2c479b8d653cb17a899032f4e4f0cb29ba8d17345f06fedf29db"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.561165 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" event={"ID":"310bb10d-00e4-4135-826e-43f7ca17bdf1","Type":"ContainerStarted","Data":"26a94b3fd272f73a9533c2048381cc7bd531097906c7efe82e8e6e1472c090c9"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.562459 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" event={"ID":"fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce","Type":"ContainerStarted","Data":"5d74e870bc4b67d9fbe53daadfe25933286d49e0e2626626ceea4a9391c734b8"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.565450 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" event={"ID":"85dcf72e-1669-4316-afe2-2dbf9059cd35","Type":"ContainerStarted","Data":"12b38a698ee2a3b993fcafdae3974613da01a9d365cf13c99dcba24fbe881e94"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.568353 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" podUID="85dcf72e-1669-4316-afe2-2dbf9059cd35" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.569161 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" event={"ID":"a0891d87-bed5-4b7b-bab9-653866be0678","Type":"ContainerStarted","Data":"238ae3409e9dd524bbbd78155d8e83b160827bb6eacd6f5d68f49ddd9372ea13"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.570409 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" podUID="a0891d87-bed5-4b7b-bab9-653866be0678" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.570612 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" event={"ID":"9ac71989-25c8-4255-8612-a9f736ab50a1","Type":"ContainerStarted","Data":"a7c40abde4aa2c59a6176a17c7b02f08a79aa24546b9d1a3be9c14de18c13db4"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.571975 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" podUID="9ac71989-25c8-4255-8612-a9f736ab50a1" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.572838 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" event={"ID":"879a3d02-050b-44dc-95ff-f4a010fe4739","Type":"ContainerStarted","Data":"75423bebede04e7639a69428a43e78c7ad2fd66416ba7b666459e770048257fd"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.574360 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" event={"ID":"52430659-ee8f-4143-a0cc-554487c4ee41","Type":"ContainerStarted","Data":"79f7666c12879b191be5fe1a0794e7cfea2d4ce99f5e164d212c37fce5f2caea"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.575854 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" event={"ID":"3e5cf32e-9b90-4518-86bb-5237dbf97e55","Type":"ContainerStarted","Data":"294488baafc5a005cad8b5092e2a6c776746c608240679dd1cbc62ec2c18393e"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.577062 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" event={"ID":"84e19238-e730-4bcf-9e09-1c6f3421a04d","Type":"ContainerStarted","Data":"165bb06e99c8291f5d2e418bb361fe4ca7c0057ec6d6e148096f56badfc2ca26"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.578288 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" event={"ID":"23a48119-c751-435f-9ed5-5a4b0dcf7ae0","Type":"ContainerStarted","Data":"13d08510bc726e4194f4d5d02888bd27aaa387c5537337c6d5e26f815f0473dd"} Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.578830 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" podUID="84e19238-e730-4bcf-9e09-1c6f3421a04d" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.579576 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" event={"ID":"5dffef78-a7d7-400c-8e1c-80fd01df4f07","Type":"ContainerStarted","Data":"c2dfe66b607f7642e546a5028e0dbe02d3c1a56bbbafa512d31807c876fb0ada"} Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.614667 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.615233 4799 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.615276 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert podName:771a024f-6f6e-43f1-82cb-076a70663c36 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:46.61526222 +0000 UTC m=+924.221215292 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert") pod "infra-operator-controller-manager-7b9c774f96-cnhvk" (UID: "771a024f-6f6e-43f1-82cb-076a70663c36") : secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.816418 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.816578 4799 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: E0319 20:20:44.816680 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert podName:8fe20f94-3898-4826-8f94-a97f5d7619d6 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:46.816659428 +0000 UTC m=+924.422612500 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899hpqp2" (UID: "8fe20f94-3898-4826-8f94-a97f5d7619d6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.903821 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.904129 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:44 crc kubenswrapper[4799]: I0319 20:20:44.950026 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:45 crc kubenswrapper[4799]: I0319 20:20:45.121072 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:45 crc kubenswrapper[4799]: I0319 20:20:45.121120 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.121297 4799 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 20:20:45 crc 
kubenswrapper[4799]: E0319 20:20:45.121342 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:47.121328758 +0000 UTC m=+924.727281830 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "webhook-server-cert" not found Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.121991 4799 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.122015 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:47.122008245 +0000 UTC m=+924.727961317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "metrics-server-cert" not found Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592408 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" podUID="503586c4-3015-43d9-bb6e-56bef997c641" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592547 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" podUID="a0891d87-bed5-4b7b-bab9-653866be0678" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592547 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" podUID="9ac71989-25c8-4255-8612-a9f736ab50a1" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592576 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" podUID="85dcf72e-1669-4316-afe2-2dbf9059cd35" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592595 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" podUID="b1950b27-4b38-4bd5-b858-fcf5aa82d7fd" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592612 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" podUID="1984ed7f-dd4a-43b7-b724-a902bccf7448" Mar 19 20:20:45 crc kubenswrapper[4799]: E0319 20:20:45.592623 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" podUID="84e19238-e730-4bcf-9e09-1c6f3421a04d" Mar 19 20:20:45 crc kubenswrapper[4799]: I0319 20:20:45.666577 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:20:45 crc kubenswrapper[4799]: I0319 20:20:45.917307 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-xt9sh"] Mar 19 20:20:46 crc kubenswrapper[4799]: I0319 20:20:46.644225 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:46 crc kubenswrapper[4799]: E0319 20:20:46.644434 4799 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:46 crc kubenswrapper[4799]: E0319 20:20:46.644534 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert podName:771a024f-6f6e-43f1-82cb-076a70663c36 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:50.644514419 +0000 UTC m=+928.250467491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert") pod "infra-operator-controller-manager-7b9c774f96-cnhvk" (UID: "771a024f-6f6e-43f1-82cb-076a70663c36") : secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:46 crc kubenswrapper[4799]: I0319 20:20:46.847185 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:46 crc kubenswrapper[4799]: E0319 20:20:46.847312 4799 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:46 crc kubenswrapper[4799]: E0319 20:20:46.847362 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert podName:8fe20f94-3898-4826-8f94-a97f5d7619d6 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:50.847349645 +0000 UTC m=+928.453302717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899hpqp2" (UID: "8fe20f94-3898-4826-8f94-a97f5d7619d6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:47 crc kubenswrapper[4799]: I0319 20:20:47.151465 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:47 crc kubenswrapper[4799]: I0319 20:20:47.151512 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:47 crc kubenswrapper[4799]: E0319 20:20:47.151621 4799 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 20:20:47 crc kubenswrapper[4799]: E0319 20:20:47.151688 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:51.151671595 +0000 UTC m=+928.757624667 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "metrics-server-cert" not found Mar 19 20:20:47 crc kubenswrapper[4799]: E0319 20:20:47.151693 4799 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 20:20:47 crc kubenswrapper[4799]: E0319 20:20:47.151776 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:51.151755108 +0000 UTC m=+928.757708250 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "webhook-server-cert" not found Mar 19 20:20:47 crc kubenswrapper[4799]: I0319 20:20:47.602402 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xt9sh" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="registry-server" containerID="cri-o://e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e" gracePeriod=2 Mar 19 20:20:48 crc kubenswrapper[4799]: I0319 20:20:48.613110 4799 generic.go:334] "Generic (PLEG): container finished" podID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerID="e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e" exitCode=0 Mar 19 20:20:48 crc kubenswrapper[4799]: I0319 20:20:48.613149 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xt9sh" 
event={"ID":"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174","Type":"ContainerDied","Data":"e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e"} Mar 19 20:20:50 crc kubenswrapper[4799]: I0319 20:20:50.700413 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:50 crc kubenswrapper[4799]: E0319 20:20:50.700550 4799 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:50 crc kubenswrapper[4799]: E0319 20:20:50.701008 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert podName:771a024f-6f6e-43f1-82cb-076a70663c36 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:58.700958496 +0000 UTC m=+936.306911578 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert") pod "infra-operator-controller-manager-7b9c774f96-cnhvk" (UID: "771a024f-6f6e-43f1-82cb-076a70663c36") : secret "infra-operator-webhook-server-cert" not found Mar 19 20:20:50 crc kubenswrapper[4799]: I0319 20:20:50.904218 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:50 crc kubenswrapper[4799]: E0319 20:20:50.904592 4799 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:50 crc kubenswrapper[4799]: E0319 20:20:50.904730 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert podName:8fe20f94-3898-4826-8f94-a97f5d7619d6 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:58.904699095 +0000 UTC m=+936.510652197 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899hpqp2" (UID: "8fe20f94-3898-4826-8f94-a97f5d7619d6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:51 crc kubenswrapper[4799]: I0319 20:20:51.209235 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:51 crc kubenswrapper[4799]: I0319 20:20:51.209294 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:51 crc kubenswrapper[4799]: E0319 20:20:51.209543 4799 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 20:20:51 crc kubenswrapper[4799]: E0319 20:20:51.209665 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:59.209637632 +0000 UTC m=+936.815590734 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "metrics-server-cert" not found Mar 19 20:20:51 crc kubenswrapper[4799]: E0319 20:20:51.209678 4799 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 20:20:51 crc kubenswrapper[4799]: E0319 20:20:51.209754 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:20:59.209735234 +0000 UTC m=+936.815688306 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "webhook-server-cert" not found Mar 19 20:20:54 crc kubenswrapper[4799]: E0319 20:20:54.904712 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e is running failed: container process not found" containerID="e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:20:54 crc kubenswrapper[4799]: E0319 20:20:54.905555 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e is running failed: container process not found" 
containerID="e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:20:54 crc kubenswrapper[4799]: E0319 20:20:54.906000 4799 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e is running failed: container process not found" containerID="e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:20:54 crc kubenswrapper[4799]: E0319 20:20:54.906045 4799 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-xt9sh" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="registry-server" Mar 19 20:20:57 crc kubenswrapper[4799]: E0319 20:20:57.439441 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 19 20:20:57 crc kubenswrapper[4799]: E0319 20:20:57.439845 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ln2vq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-gjj5c_openstack-operators(879a3d02-050b-44dc-95ff-f4a010fe4739): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:20:57 crc kubenswrapper[4799]: E0319 20:20:57.441082 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" podUID="879a3d02-050b-44dc-95ff-f4a010fe4739" Mar 19 20:20:57 crc kubenswrapper[4799]: E0319 20:20:57.682259 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" podUID="879a3d02-050b-44dc-95ff-f4a010fe4739" Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.755967 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.756429 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.756492 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.757344 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13b6fab9ed6c0d9132855fd64438b4c86e33e17491f418536e545f20790a7c7a"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.757493 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://13b6fab9ed6c0d9132855fd64438b4c86e33e17491f418536e545f20790a7c7a" gracePeriod=600 Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.765070 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.775764 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/771a024f-6f6e-43f1-82cb-076a70663c36-cert\") pod \"infra-operator-controller-manager-7b9c774f96-cnhvk\" (UID: \"771a024f-6f6e-43f1-82cb-076a70663c36\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:58 crc kubenswrapper[4799]: I0319 20:20:58.968033 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:20:58 crc kubenswrapper[4799]: E0319 20:20:58.968353 4799 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:58 crc kubenswrapper[4799]: E0319 20:20:58.968487 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert podName:8fe20f94-3898-4826-8f94-a97f5d7619d6 nodeName:}" failed. No retries permitted until 2026-03-19 20:21:14.968460299 +0000 UTC m=+952.574413411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899hpqp2" (UID: "8fe20f94-3898-4826-8f94-a97f5d7619d6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 20:20:59 crc kubenswrapper[4799]: I0319 20:20:59.026348 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:20:59 crc kubenswrapper[4799]: I0319 20:20:59.273667 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:59 crc kubenswrapper[4799]: I0319 20:20:59.273716 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:20:59 crc kubenswrapper[4799]: E0319 20:20:59.273854 4799 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 20:20:59 crc kubenswrapper[4799]: E0319 20:20:59.273923 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:21:15.273903148 +0000 UTC m=+952.879856240 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "metrics-server-cert" not found Mar 19 20:20:59 crc kubenswrapper[4799]: E0319 20:20:59.274020 4799 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 20:20:59 crc kubenswrapper[4799]: E0319 20:20:59.274161 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs podName:d0525132-b508-40b7-a9eb-4773cfde1c32 nodeName:}" failed. No retries permitted until 2026-03-19 20:21:15.274129604 +0000 UTC m=+952.880082706 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-bbbxq" (UID: "d0525132-b508-40b7-a9eb-4773cfde1c32") : secret "webhook-server-cert" not found Mar 19 20:20:59 crc kubenswrapper[4799]: I0319 20:20:59.700339 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="13b6fab9ed6c0d9132855fd64438b4c86e33e17491f418536e545f20790a7c7a" exitCode=0 Mar 19 20:20:59 crc kubenswrapper[4799]: I0319 20:20:59.700454 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"13b6fab9ed6c0d9132855fd64438b4c86e33e17491f418536e545f20790a7c7a"} Mar 19 20:20:59 crc kubenswrapper[4799]: I0319 20:20:59.701075 4799 scope.go:117] "RemoveContainer" containerID="74d053865b82ea1997ab8189b4caa9e61f0e0eecb338b66d10379da64224a7e7" Mar 19 20:21:00 crc kubenswrapper[4799]: E0319 20:21:00.697372 4799 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 19 20:21:00 crc kubenswrapper[4799]: E0319 20:21:00.697684 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tw6p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-4v4kl_openstack-operators(3cac2ce5-3a90-4588-88ba-11557915c62c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:21:00 crc kubenswrapper[4799]: E0319 20:21:00.699600 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" podUID="3cac2ce5-3a90-4588-88ba-11557915c62c" Mar 19 20:21:00 crc kubenswrapper[4799]: E0319 20:21:00.709591 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" podUID="3cac2ce5-3a90-4588-88ba-11557915c62c" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.591610 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.709930 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcc7\" (UniqueName: \"kubernetes.io/projected/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-kube-api-access-gqcc7\") pod \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.709992 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-utilities\") pod \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.710037 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-catalog-content\") pod \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\" (UID: \"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174\") " Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.711112 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-utilities" (OuterVolumeSpecName: "utilities") pod "10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" (UID: "10ec3bd0-9b6e-4cd9-bca2-e1e75863a174"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.717989 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xt9sh" event={"ID":"10ec3bd0-9b6e-4cd9-bca2-e1e75863a174","Type":"ContainerDied","Data":"752e1585fedc2aa1d6fcdb753c7d12a57eb2ff361f14a864bb6ca3074c767c62"} Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.718093 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xt9sh" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.730782 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-kube-api-access-gqcc7" (OuterVolumeSpecName: "kube-api-access-gqcc7") pod "10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" (UID: "10ec3bd0-9b6e-4cd9-bca2-e1e75863a174"). InnerVolumeSpecName "kube-api-access-gqcc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.762351 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" (UID: "10ec3bd0-9b6e-4cd9-bca2-e1e75863a174"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.812262 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqcc7\" (UniqueName: \"kubernetes.io/projected/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-kube-api-access-gqcc7\") on node \"crc\" DevicePath \"\"" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.812297 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:21:01 crc kubenswrapper[4799]: I0319 20:21:01.812310 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.047267 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xt9sh"] Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.052361 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xt9sh"] Mar 19 20:21:02 crc kubenswrapper[4799]: E0319 20:21:02.295214 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 19 20:21:02 crc kubenswrapper[4799]: E0319 20:21:02.295438 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w5qh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-ct2xj_openstack-operators(52430659-ee8f-4143-a0cc-554487c4ee41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:21:02 crc kubenswrapper[4799]: E0319 20:21:02.297325 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" podUID="52430659-ee8f-4143-a0cc-554487c4ee41" Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.315693 4799 scope.go:117] "RemoveContainer" containerID="e07edf163a01f15f339889a2077a13e6d6dce8b841210b0d210c59852e8abd5e" Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.383685 4799 scope.go:117] "RemoveContainer" containerID="45b2042c3316b5e0aa64125dc1ed0908a44005ceacfde2031efa979ea9f59910" Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.409180 4799 scope.go:117] "RemoveContainer" containerID="d55deb3759a278a548e31f104fa87abe43f46bc837ffb3776e0ca75a25f3632d" Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 
20:21:02.761483 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"ebf1a7c33ca2e5a253f33af42655dacb79a5142f188b73cc17c9ab070ccc29c9"} Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.765329 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" event={"ID":"8f841081-d1d8-464a-ae77-af76f0a109ea","Type":"ContainerStarted","Data":"bdd58049ab214abd8a100cc5a18d146e6d98e1e47e5ff663eb9aefda087ec0a8"} Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.765415 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:21:02 crc kubenswrapper[4799]: E0319 20:21:02.768235 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" podUID="52430659-ee8f-4143-a0cc-554487c4ee41" Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.770120 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk"] Mar 19 20:21:02 crc kubenswrapper[4799]: I0319 20:21:02.819576 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" podStartSLOduration=8.524255517 podStartE2EDuration="20.819561515s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:43.680569097 +0000 UTC m=+921.286522169" lastFinishedPulling="2026-03-19 20:20:55.975875065 
+0000 UTC m=+933.581828167" observedRunningTime="2026-03-19 20:21:02.816216928 +0000 UTC m=+940.422170000" watchObservedRunningTime="2026-03-19 20:21:02.819561515 +0000 UTC m=+940.425514587" Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.126910 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" path="/var/lib/kubelet/pods/10ec3bd0-9b6e-4cd9-bca2-e1e75863a174/volumes" Mar 19 20:21:03 crc kubenswrapper[4799]: W0319 20:21:03.265442 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771a024f_6f6e_43f1_82cb_076a70663c36.slice/crio-def50d65d5c82ff8d89edf2a22979e7ff32f4257f4c03c1aa0ec04027f662003 WatchSource:0}: Error finding container def50d65d5c82ff8d89edf2a22979e7ff32f4257f4c03c1aa0ec04027f662003: Status 404 returned error can't find the container with id def50d65d5c82ff8d89edf2a22979e7ff32f4257f4c03c1aa0ec04027f662003 Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.774934 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" event={"ID":"f5a6c547-0da9-4313-817a-9562fa9cb775","Type":"ContainerStarted","Data":"0c6eacc8100ee976b82aac6e953babc269f34afc1e9469dbea2dadab17a55760"} Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.775062 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.776103 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" event={"ID":"771a024f-6f6e-43f1-82cb-076a70663c36","Type":"ContainerStarted","Data":"def50d65d5c82ff8d89edf2a22979e7ff32f4257f4c03c1aa0ec04027f662003"} Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.777179 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" event={"ID":"23a48119-c751-435f-9ed5-5a4b0dcf7ae0","Type":"ContainerStarted","Data":"d6253e46610870d932413650e09f7762a0f8593d342b1978018a1dfe2cafe1db"} Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.777313 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.790924 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" event={"ID":"72be7e05-f329-4420-9632-3f6827c4e0e9","Type":"ContainerStarted","Data":"00b06c7ef23d7048f3905ed0b33e163b51a670c3ebebda613cb2d6329d5a150c"} Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.791307 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.830982 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" podStartSLOduration=4.99638755 podStartE2EDuration="21.830961692s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:43.676907932 +0000 UTC m=+921.282861004" lastFinishedPulling="2026-03-19 20:21:00.511482034 +0000 UTC m=+938.117435146" observedRunningTime="2026-03-19 20:21:03.829804612 +0000 UTC m=+941.435757684" watchObservedRunningTime="2026-03-19 20:21:03.830961692 +0000 UTC m=+941.436914764" Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.856196 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" podStartSLOduration=4.6900742189999995 podStartE2EDuration="21.856176424s" podCreationTimestamp="2026-03-19 20:20:42 
+0000 UTC" firstStartedPulling="2026-03-19 20:20:44.359336481 +0000 UTC m=+921.965289553" lastFinishedPulling="2026-03-19 20:21:01.525438686 +0000 UTC m=+939.131391758" observedRunningTime="2026-03-19 20:21:03.852475018 +0000 UTC m=+941.458428090" watchObservedRunningTime="2026-03-19 20:21:03.856176424 +0000 UTC m=+941.462129486" Mar 19 20:21:03 crc kubenswrapper[4799]: I0319 20:21:03.882839 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" podStartSLOduration=3.886578659 podStartE2EDuration="21.882823423s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.325763203 +0000 UTC m=+921.931716275" lastFinishedPulling="2026-03-19 20:21:02.322007967 +0000 UTC m=+939.927961039" observedRunningTime="2026-03-19 20:21:03.878812789 +0000 UTC m=+941.484765871" watchObservedRunningTime="2026-03-19 20:21:03.882823423 +0000 UTC m=+941.488776495" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.844071 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" event={"ID":"503586c4-3015-43d9-bb6e-56bef997c641","Type":"ContainerStarted","Data":"507521bf9ba4f8bcc322fcb30093204799c5167637baec1573cc8a33023302cd"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.844705 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.847107 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" event={"ID":"1e9b69cc-5dc0-400e-9894-7ff0b173e6cb","Type":"ContainerStarted","Data":"ab8dbef72da12a5b0bc2642eab5e45db399a64c8c84927de37e3fc99e2795662"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.847338 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.850482 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" event={"ID":"85dcf72e-1669-4316-afe2-2dbf9059cd35","Type":"ContainerStarted","Data":"f837e8e8147134f5ddb54f4ae4da0d68efe710b02d7f3f757c7211b61a7db3d0"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.850689 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.853199 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" event={"ID":"a0891d87-bed5-4b7b-bab9-653866be0678","Type":"ContainerStarted","Data":"ab3a4a850c0a7fd07c2b011381e382f54e9c2f89ed4eee9eb6b0543df3e2e11b"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.853316 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.860481 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" event={"ID":"9ac71989-25c8-4255-8612-a9f736ab50a1","Type":"ContainerStarted","Data":"b930a20c182c507d3b6f11bbadce8c20d1ff6606c3a6300042ac40879537c223"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.860750 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.862285 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" 
event={"ID":"310bb10d-00e4-4135-826e-43f7ca17bdf1","Type":"ContainerStarted","Data":"1f5fbb3462beb25c597c0fef7951e775d67756970ba95966bd481a6af574caba"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.862448 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.863845 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" event={"ID":"3e5cf32e-9b90-4518-86bb-5237dbf97e55","Type":"ContainerStarted","Data":"d3e843c2ffc43850048d817293f0ea9d8b59b0e89afd946002fe629921eabf1b"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.863997 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.865934 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" event={"ID":"1984ed7f-dd4a-43b7-b724-a902bccf7448","Type":"ContainerStarted","Data":"2c6b79c6301e3c840ba6e05480944a54d722546ba6eddc5686a01f1568fe1c0e"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.866131 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.868065 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" event={"ID":"84e19238-e730-4bcf-9e09-1c6f3421a04d","Type":"ContainerStarted","Data":"f9d78f7e721399081581da4963fd871778df40777f45c89166087eeb2d101912"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.868235 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.870116 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" podStartSLOduration=1.959167103 podStartE2EDuration="24.87010635s" podCreationTimestamp="2026-03-19 20:20:43 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.39718864 +0000 UTC m=+922.003141712" lastFinishedPulling="2026-03-19 20:21:07.308127857 +0000 UTC m=+944.914080959" observedRunningTime="2026-03-19 20:21:07.865571973 +0000 UTC m=+945.471525045" watchObservedRunningTime="2026-03-19 20:21:07.87010635 +0000 UTC m=+945.476059422" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.870663 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" event={"ID":"5dffef78-a7d7-400c-8e1c-80fd01df4f07","Type":"ContainerStarted","Data":"b06d26413b1b1605150170a73324d3d2c28d5298cc66ad9f00f55699b111e939"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.870762 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.873691 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" event={"ID":"f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b","Type":"ContainerStarted","Data":"76583e3b9848101fc027cb47a1d36849e54c1f818b742ade918e1d6c91447854"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.876486 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" event={"ID":"fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce","Type":"ContainerStarted","Data":"a427ce8c9c63272a1e8a60c576e0470953ce5f62bfa44ef7e37454ec76726a19"} Mar 19 
20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.876776 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.881585 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" event={"ID":"b1950b27-4b38-4bd5-b858-fcf5aa82d7fd","Type":"ContainerStarted","Data":"063904caa4e1c197db5a8905e2094ea20e8a8b5c67754cc8902bbce331f6c7ec"} Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.881750 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.892838 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" podStartSLOduration=2.98842123 podStartE2EDuration="25.892823458s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.393480164 +0000 UTC m=+921.999433236" lastFinishedPulling="2026-03-19 20:21:07.297882382 +0000 UTC m=+944.903835464" observedRunningTime="2026-03-19 20:21:07.891474843 +0000 UTC m=+945.497427915" watchObservedRunningTime="2026-03-19 20:21:07.892823458 +0000 UTC m=+945.498776530" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.924620 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" podStartSLOduration=7.99695397 podStartE2EDuration="25.92460521s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.358823848 +0000 UTC m=+921.964776910" lastFinishedPulling="2026-03-19 20:21:02.286475078 +0000 UTC m=+939.892428150" observedRunningTime="2026-03-19 20:21:07.921814157 +0000 UTC m=+945.527767229" 
watchObservedRunningTime="2026-03-19 20:21:07.92460521 +0000 UTC m=+945.530558282" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.965242 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" podStartSLOduration=8.007159913 podStartE2EDuration="25.96522488s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.329734735 +0000 UTC m=+921.935687807" lastFinishedPulling="2026-03-19 20:21:02.287799702 +0000 UTC m=+939.893752774" observedRunningTime="2026-03-19 20:21:07.964131652 +0000 UTC m=+945.570084724" watchObservedRunningTime="2026-03-19 20:21:07.96522488 +0000 UTC m=+945.571177952" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.966418 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" podStartSLOduration=7.9705616070000005 podStartE2EDuration="25.966413191s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.291544968 +0000 UTC m=+921.897498040" lastFinishedPulling="2026-03-19 20:21:02.287396552 +0000 UTC m=+939.893349624" observedRunningTime="2026-03-19 20:21:07.946568577 +0000 UTC m=+945.552521649" watchObservedRunningTime="2026-03-19 20:21:07.966413191 +0000 UTC m=+945.572366253" Mar 19 20:21:07 crc kubenswrapper[4799]: I0319 20:21:07.996606 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" podStartSLOduration=3.071747234 podStartE2EDuration="25.996588721s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.382777887 +0000 UTC m=+921.988730959" lastFinishedPulling="2026-03-19 20:21:07.307619354 +0000 UTC m=+944.913572446" observedRunningTime="2026-03-19 20:21:07.992108515 +0000 UTC m=+945.598061587" 
watchObservedRunningTime="2026-03-19 20:21:07.996588721 +0000 UTC m=+945.602541793" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.055729 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" podStartSLOduration=3.166340761 podStartE2EDuration="26.05571419s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.404766936 +0000 UTC m=+922.010720008" lastFinishedPulling="2026-03-19 20:21:07.294140345 +0000 UTC m=+944.900093437" observedRunningTime="2026-03-19 20:21:08.017992715 +0000 UTC m=+945.623945787" watchObservedRunningTime="2026-03-19 20:21:08.05571419 +0000 UTC m=+945.661667262" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.080103 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" podStartSLOduration=8.117127106 podStartE2EDuration="26.0800822s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.359051723 +0000 UTC m=+921.965004795" lastFinishedPulling="2026-03-19 20:21:02.322006817 +0000 UTC m=+939.927959889" observedRunningTime="2026-03-19 20:21:08.061551271 +0000 UTC m=+945.667504343" watchObservedRunningTime="2026-03-19 20:21:08.0800822 +0000 UTC m=+945.686035272" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.081728 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2scch" podStartSLOduration=7.125985467 podStartE2EDuration="25.081719843s" podCreationTimestamp="2026-03-19 20:20:43 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.359254879 +0000 UTC m=+921.965207961" lastFinishedPulling="2026-03-19 20:21:02.314989265 +0000 UTC m=+939.920942337" observedRunningTime="2026-03-19 20:21:08.07621598 +0000 UTC m=+945.682169052" watchObservedRunningTime="2026-03-19 
20:21:08.081719843 +0000 UTC m=+945.687672915" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.155524 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" podStartSLOduration=7.711443355 podStartE2EDuration="26.155507131s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:43.843266444 +0000 UTC m=+921.449219516" lastFinishedPulling="2026-03-19 20:21:02.28733023 +0000 UTC m=+939.893283292" observedRunningTime="2026-03-19 20:21:08.120178157 +0000 UTC m=+945.726131239" watchObservedRunningTime="2026-03-19 20:21:08.155507131 +0000 UTC m=+945.761460203" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.188840 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" podStartSLOduration=3.269376086 podStartE2EDuration="26.188824203s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.385695393 +0000 UTC m=+921.991648465" lastFinishedPulling="2026-03-19 20:21:07.3051435 +0000 UTC m=+944.911096582" observedRunningTime="2026-03-19 20:21:08.160751067 +0000 UTC m=+945.766704149" watchObservedRunningTime="2026-03-19 20:21:08.188824203 +0000 UTC m=+945.794777265" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.190492 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" podStartSLOduration=3.281689564 podStartE2EDuration="26.190485046s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.397040766 +0000 UTC m=+922.002993838" lastFinishedPulling="2026-03-19 20:21:07.305836228 +0000 UTC m=+944.911789320" observedRunningTime="2026-03-19 20:21:08.186071532 +0000 UTC m=+945.792024604" watchObservedRunningTime="2026-03-19 20:21:08.190485046 +0000 UTC 
m=+945.796438118" Mar 19 20:21:08 crc kubenswrapper[4799]: I0319 20:21:08.209084 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" podStartSLOduration=3.295370918 podStartE2EDuration="26.209065746s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.394327146 +0000 UTC m=+922.000280208" lastFinishedPulling="2026-03-19 20:21:07.308021954 +0000 UTC m=+944.913975036" observedRunningTime="2026-03-19 20:21:08.206254814 +0000 UTC m=+945.812207886" watchObservedRunningTime="2026-03-19 20:21:08.209065746 +0000 UTC m=+945.815018808" Mar 19 20:21:09 crc kubenswrapper[4799]: I0319 20:21:09.919468 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" event={"ID":"771a024f-6f6e-43f1-82cb-076a70663c36","Type":"ContainerStarted","Data":"ec1b19e9ff7902d175c9e707d9d70f92f4000b9831465b5ed967e3d29658f5e7"} Mar 19 20:21:09 crc kubenswrapper[4799]: I0319 20:21:09.920338 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:21:09 crc kubenswrapper[4799]: I0319 20:21:09.940248 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" podStartSLOduration=21.802023613 podStartE2EDuration="27.940233878s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:21:03.293141842 +0000 UTC m=+940.899094924" lastFinishedPulling="2026-03-19 20:21:09.431352107 +0000 UTC m=+947.037305189" observedRunningTime="2026-03-19 20:21:09.938956845 +0000 UTC m=+947.544909957" watchObservedRunningTime="2026-03-19 20:21:09.940233878 +0000 UTC m=+947.546186950" Mar 19 20:21:12 crc kubenswrapper[4799]: I0319 20:21:12.884902 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jphcf" Mar 19 20:21:12 crc kubenswrapper[4799]: I0319 20:21:12.911905 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-xrrtz" Mar 19 20:21:12 crc kubenswrapper[4799]: I0319 20:21:12.946780 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6j7gd" Mar 19 20:21:12 crc kubenswrapper[4799]: I0319 20:21:12.948902 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" event={"ID":"879a3d02-050b-44dc-95ff-f4a010fe4739","Type":"ContainerStarted","Data":"1e890ced2fac7bde2cf23e7adf7b669490e9044da3dfcd07213625bd4fbd0fa8"} Mar 19 20:21:12 crc kubenswrapper[4799]: I0319 20:21:12.949737 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:21:12 crc kubenswrapper[4799]: I0319 20:21:12.993606 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-d5sq6" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.016118 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" podStartSLOduration=3.596405035 podStartE2EDuration="31.016093705s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.377155522 +0000 UTC m=+921.983108604" lastFinishedPulling="2026-03-19 20:21:11.796844192 +0000 UTC m=+949.402797274" observedRunningTime="2026-03-19 20:21:13.011923447 +0000 UTC m=+950.617876539" watchObservedRunningTime="2026-03-19 20:21:13.016093705 +0000 UTC m=+950.622046787" Mar 19 20:21:13 crc 
kubenswrapper[4799]: I0319 20:21:13.032025 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-f7g26" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.054648 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kswht" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.083086 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-rrp47" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.231843 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-rhbks" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.257770 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-j6bcm" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.274698 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-jqm8n" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.309285 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6wvnr" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.343916 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5s2dz" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.569891 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4z6tx" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.684042 4799 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-dhfqp" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.785463 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-vp4rw" Mar 19 20:21:13 crc kubenswrapper[4799]: I0319 20:21:13.912499 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jrtt2" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.018479 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.025530 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8fe20f94-3898-4826-8f94-a97f5d7619d6-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899hpqp2\" (UID: \"8fe20f94-3898-4826-8f94-a97f5d7619d6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.026813 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.328612 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.328979 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.334071 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.334672 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0525132-b508-40b7-a9eb-4773cfde1c32-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-bbbxq\" (UID: \"d0525132-b508-40b7-a9eb-4773cfde1c32\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.598032 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2"] Mar 19 20:21:15 crc kubenswrapper[4799]: W0319 20:21:15.601023 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fe20f94_3898_4826_8f94_a97f5d7619d6.slice/crio-379bdc68ea8852ed6ba393fd6c2afdd76f8688e05065d815f6e37f78bb8f318a WatchSource:0}: Error finding container 379bdc68ea8852ed6ba393fd6c2afdd76f8688e05065d815f6e37f78bb8f318a: Status 404 returned error can't find the container with id 379bdc68ea8852ed6ba393fd6c2afdd76f8688e05065d815f6e37f78bb8f318a Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.615457 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.972192 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" event={"ID":"52430659-ee8f-4143-a0cc-554487c4ee41","Type":"ContainerStarted","Data":"02ac14347009cd438b8a15340c1462830f0b2ee5152ccaa4f5aaed8d7c452ccc"} Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.973305 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" event={"ID":"8fe20f94-3898-4826-8f94-a97f5d7619d6","Type":"ContainerStarted","Data":"379bdc68ea8852ed6ba393fd6c2afdd76f8688e05065d815f6e37f78bb8f318a"} Mar 19 20:21:15 crc kubenswrapper[4799]: I0319 20:21:15.973394 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.000720 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" 
podStartSLOduration=2.638607693 podStartE2EDuration="34.000702542s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.312844218 +0000 UTC m=+921.918797290" lastFinishedPulling="2026-03-19 20:21:15.674939027 +0000 UTC m=+953.280892139" observedRunningTime="2026-03-19 20:21:15.99716012 +0000 UTC m=+953.603113242" watchObservedRunningTime="2026-03-19 20:21:16.000702542 +0000 UTC m=+953.606655624" Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.073885 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq"] Mar 19 20:21:16 crc kubenswrapper[4799]: W0319 20:21:16.078628 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0525132_b508_40b7_a9eb_4773cfde1c32.slice/crio-e22c0107c8e3e7600e791fe8fb93e396e9415fc28e85ab100759e5dcb7357cda WatchSource:0}: Error finding container e22c0107c8e3e7600e791fe8fb93e396e9415fc28e85ab100759e5dcb7357cda: Status 404 returned error can't find the container with id e22c0107c8e3e7600e791fe8fb93e396e9415fc28e85ab100759e5dcb7357cda Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.988849 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" event={"ID":"d0525132-b508-40b7-a9eb-4773cfde1c32","Type":"ContainerStarted","Data":"31a02e9b2dcb6aab67dcc6c50a2f8ec9b1d5476e869e4484fea46d3d124831d8"} Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.989220 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" event={"ID":"d0525132-b508-40b7-a9eb-4773cfde1c32","Type":"ContainerStarted","Data":"e22c0107c8e3e7600e791fe8fb93e396e9415fc28e85ab100759e5dcb7357cda"} Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.990277 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.992275 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" event={"ID":"3cac2ce5-3a90-4588-88ba-11557915c62c","Type":"ContainerStarted","Data":"09c7b71df00dd08e786ab7ae0f2c981fe22df455cd8ffc206c5a5264f579564c"} Mar 19 20:21:16 crc kubenswrapper[4799]: I0319 20:21:16.992732 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:21:17 crc kubenswrapper[4799]: I0319 20:21:17.021399 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" podStartSLOduration=34.021366708 podStartE2EDuration="34.021366708s" podCreationTimestamp="2026-03-19 20:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:21:17.017143799 +0000 UTC m=+954.623096881" watchObservedRunningTime="2026-03-19 20:21:17.021366708 +0000 UTC m=+954.627319780" Mar 19 20:21:17 crc kubenswrapper[4799]: I0319 20:21:17.043975 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" podStartSLOduration=1.8281188849999999 podStartE2EDuration="34.043949222s" podCreationTimestamp="2026-03-19 20:20:43 +0000 UTC" firstStartedPulling="2026-03-19 20:20:44.362148204 +0000 UTC m=+921.968101276" lastFinishedPulling="2026-03-19 20:21:16.577978541 +0000 UTC m=+954.183931613" observedRunningTime="2026-03-19 20:21:17.036055058 +0000 UTC m=+954.642008130" watchObservedRunningTime="2026-03-19 20:21:17.043949222 +0000 UTC m=+954.649902304" Mar 19 20:21:18 crc kubenswrapper[4799]: I0319 20:21:18.001350 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" event={"ID":"8fe20f94-3898-4826-8f94-a97f5d7619d6","Type":"ContainerStarted","Data":"77b78536bdb4de695916cb542e6a3167fa0ef8a0aa8d9ca68aed1666be7eaa54"} Mar 19 20:21:18 crc kubenswrapper[4799]: I0319 20:21:18.001828 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:21:18 crc kubenswrapper[4799]: I0319 20:21:18.047647 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" podStartSLOduration=34.136306539 podStartE2EDuration="36.047625579s" podCreationTimestamp="2026-03-19 20:20:42 +0000 UTC" firstStartedPulling="2026-03-19 20:21:15.603286694 +0000 UTC m=+953.209239796" lastFinishedPulling="2026-03-19 20:21:17.514605764 +0000 UTC m=+955.120558836" observedRunningTime="2026-03-19 20:21:18.045130005 +0000 UTC m=+955.651083127" watchObservedRunningTime="2026-03-19 20:21:18.047625579 +0000 UTC m=+955.653578651" Mar 19 20:21:19 crc kubenswrapper[4799]: I0319 20:21:19.034748 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-cnhvk" Mar 19 20:21:23 crc kubenswrapper[4799]: I0319 20:21:23.136102 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-ct2xj" Mar 19 20:21:23 crc kubenswrapper[4799]: I0319 20:21:23.484408 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-gjj5c" Mar 19 20:21:23 crc kubenswrapper[4799]: I0319 20:21:23.716319 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4v4kl" Mar 19 20:21:25 crc kubenswrapper[4799]: I0319 20:21:25.036907 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899hpqp2" Mar 19 20:21:25 crc kubenswrapper[4799]: I0319 20:21:25.626501 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-bbbxq" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.590107 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-9sjt5"] Mar 19 20:21:44 crc kubenswrapper[4799]: E0319 20:21:44.591036 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="extract-content" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.591053 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="extract-content" Mar 19 20:21:44 crc kubenswrapper[4799]: E0319 20:21:44.591063 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="registry-server" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.591071 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="registry-server" Mar 19 20:21:44 crc kubenswrapper[4799]: E0319 20:21:44.591086 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="extract-utilities" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.591095 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="extract-utilities" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.591278 4799 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="10ec3bd0-9b6e-4cd9-bca2-e1e75863a174" containerName="registry-server" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.592799 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.594972 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.595226 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.595414 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.596529 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7ptpz" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.603676 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-9sjt5"] Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.646993 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-sff4q"] Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.648055 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-sff4q" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.650340 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.659240 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-sff4q"] Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.706148 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwxbx\" (UniqueName: \"kubernetes.io/projected/d58ed172-1335-4a56-818e-9dcec53add01-kube-api-access-nwxbx\") pod \"dnsmasq-dns-5448ff6dc7-9sjt5\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.706229 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ed172-1335-4a56-818e-9dcec53add01-config\") pod \"dnsmasq-dns-5448ff6dc7-9sjt5\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.807986 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8zpx\" (UniqueName: \"kubernetes.io/projected/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-kube-api-access-s8zpx\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.808068 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwxbx\" (UniqueName: \"kubernetes.io/projected/d58ed172-1335-4a56-818e-9dcec53add01-kube-api-access-nwxbx\") pod \"dnsmasq-dns-5448ff6dc7-9sjt5\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " 
pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.808211 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-config\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.808305 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-dns-svc\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.808487 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ed172-1335-4a56-818e-9dcec53add01-config\") pod \"dnsmasq-dns-5448ff6dc7-9sjt5\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.809492 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ed172-1335-4a56-818e-9dcec53add01-config\") pod \"dnsmasq-dns-5448ff6dc7-9sjt5\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.828113 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwxbx\" (UniqueName: \"kubernetes.io/projected/d58ed172-1335-4a56-818e-9dcec53add01-kube-api-access-nwxbx\") pod \"dnsmasq-dns-5448ff6dc7-9sjt5\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:21:44 crc 
kubenswrapper[4799]: I0319 20:21:44.909609 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.910104 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-config\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.910200 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-dns-svc\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.910366 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8zpx\" (UniqueName: \"kubernetes.io/projected/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-kube-api-access-s8zpx\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.911018 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-config\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.911052 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-dns-svc\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.927819 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8zpx\" (UniqueName: \"kubernetes.io/projected/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-kube-api-access-s8zpx\") pod \"dnsmasq-dns-64696987c5-sff4q\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:44 crc kubenswrapper[4799]: I0319 20:21:44.962130 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-sff4q"
Mar 19 20:21:45 crc kubenswrapper[4799]: I0319 20:21:45.348715 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-9sjt5"]
Mar 19 20:21:45 crc kubenswrapper[4799]: I0319 20:21:45.413802 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-sff4q"]
Mar 19 20:21:46 crc kubenswrapper[4799]: I0319 20:21:46.249253 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" event={"ID":"d58ed172-1335-4a56-818e-9dcec53add01","Type":"ContainerStarted","Data":"c428d800c5bc6e4a08b9f209b340962132d68b63a3be0219364bd1d98b15288a"}
Mar 19 20:21:46 crc kubenswrapper[4799]: I0319 20:21:46.250912 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-sff4q" event={"ID":"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217","Type":"ContainerStarted","Data":"683773c1a8a1059b32c09c14b58631af122a837e089260b501a1096c17f06787"}
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.443107 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-9sjt5"]
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.482950 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-zpfvp"]
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.486495 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.491644 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-zpfvp"]
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.665754 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-config\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.665789 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728kn\" (UniqueName: \"kubernetes.io/projected/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-kube-api-access-728kn\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.665829 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.708181 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-sff4q"]
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.728792 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-26g59"]
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.729825 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.748702 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-26g59"]
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.766979 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-config\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.767031 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728kn\" (UniqueName: \"kubernetes.io/projected/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-kube-api-access-728kn\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.767067 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.767948 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.768355 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-config\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.795926 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728kn\" (UniqueName: \"kubernetes.io/projected/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-kube-api-access-728kn\") pod \"dnsmasq-dns-658f55c9f5-zpfvp\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.811918 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.868371 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.868452 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6dd\" (UniqueName: \"kubernetes.io/projected/c7224368-9599-456c-a0f4-002960e5f18c-kube-api-access-sl6dd\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.868534 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-config\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.969442 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6dd\" (UniqueName: \"kubernetes.io/projected/c7224368-9599-456c-a0f4-002960e5f18c-kube-api-access-sl6dd\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.970776 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-config\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.970827 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.971818 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:47 crc kubenswrapper[4799]: I0319 20:21:47.971896 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-config\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.009096 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6dd\" (UniqueName: \"kubernetes.io/projected/c7224368-9599-456c-a0f4-002960e5f18c-kube-api-access-sl6dd\") pod \"dnsmasq-dns-54b5dffb47-26g59\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.045525 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-26g59"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.282940 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-zpfvp"]
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.314799 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-26g59"]
Mar 19 20:21:48 crc kubenswrapper[4799]: W0319 20:21:48.319686 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7224368_9599_456c_a0f4_002960e5f18c.slice/crio-8af10146d6bd88ce7cab95ee3f5d5f3225bb3dee290d9e83abd7002c4bb040d5 WatchSource:0}: Error finding container 8af10146d6bd88ce7cab95ee3f5d5f3225bb3dee290d9e83abd7002c4bb040d5: Status 404 returned error can't find the container with id 8af10146d6bd88ce7cab95ee3f5d5f3225bb3dee290d9e83abd7002c4bb040d5
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.606788 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.608740 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.613309 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.613331 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.613368 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.613377 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.613446 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.617250 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.617527 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v9gjw"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.630196 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.783785 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.783832 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.783878 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ee15a17-4d32-468e-8a57-2a597cebd850-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.783896 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.783944 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.783961 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.784213 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.784316 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.784461 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.784578 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6b2h\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-kube-api-access-h6b2h\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.784627 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ee15a17-4d32-468e-8a57-2a597cebd850-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.885336 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.895232 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.895599 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.895647 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.895739 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.895806 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6b2h\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-kube-api-access-h6b2h\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.895943 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ee15a17-4d32-468e-8a57-2a597cebd850-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.896021 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.896060 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.896204 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ee15a17-4d32-468e-8a57-2a597cebd850-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.896228 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.896403 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.906714 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.907150 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.910495 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.911005 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.915540 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.919193 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.927590 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.927691 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.960158 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.960414 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.960421 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.960455 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ee15a17-4d32-468e-8a57-2a597cebd850-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.960898 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.961023 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.961293 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lfwkh"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.961507 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.961671 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.963228 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.963782 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ee15a17-4d32-468e-8a57-2a597cebd850-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.965929 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6b2h\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-kube-api-access-h6b2h\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:48 crc kubenswrapper[4799]: I0319 20:21:48.974482 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101552 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101606 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101627 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101644 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101680 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/749a043f-5262-416d-b639-9ff8fdcf7f12-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101797 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-config-data\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101887 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101928 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.101985 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/749a043f-5262-416d-b639-9ff8fdcf7f12-pod-info\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.102000 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-server-conf\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.102141 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9cfx\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-kube-api-access-z9cfx\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203487 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9cfx\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-kube-api-access-z9cfx\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203548 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203576 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203590 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203610 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203640 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/749a043f-5262-416d-b639-9ff8fdcf7f12-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203683 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-config-data\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.203948 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.204111 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.204228 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.204632 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.204654 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/749a043f-5262-416d-b639-9ff8fdcf7f12-pod-info\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.204669 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-server-conf\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.205004 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0"
Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.205727 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-config-data\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") "
pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.205793 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.206549 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-server-conf\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.210556 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.213203 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.216894 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/749a043f-5262-416d-b639-9ff8fdcf7f12-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.224059 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/749a043f-5262-416d-b639-9ff8fdcf7f12-pod-info\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.230641 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9cfx\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-kube-api-access-z9cfx\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.250848 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.269634 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.300827 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" event={"ID":"c7224368-9599-456c-a0f4-002960e5f18c","Type":"ContainerStarted","Data":"8af10146d6bd88ce7cab95ee3f5d5f3225bb3dee290d9e83abd7002c4bb040d5"} Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.303111 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" event={"ID":"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c","Type":"ContainerStarted","Data":"569aeb9ee175609092c06c737f20617a5fdb92f90f4b8a3923b80776cf472540"} Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.321214 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.944524 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.946768 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.949766 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lck76" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.950212 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.950839 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.950897 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.955385 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 20:21:49 crc kubenswrapper[4799]: I0319 20:21:49.963774 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.117931 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b6d87e-0486-4c0c-9578-514626ca7579-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.117977 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.118000 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.118036 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2zgm\" (UniqueName: \"kubernetes.io/projected/e5b6d87e-0486-4c0c-9578-514626ca7579-kube-api-access-l2zgm\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.118065 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5b6d87e-0486-4c0c-9578-514626ca7579-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.118085 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b6d87e-0486-4c0c-9578-514626ca7579-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.118124 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.118141 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219465 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5b6d87e-0486-4c0c-9578-514626ca7579-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219552 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b6d87e-0486-4c0c-9578-514626ca7579-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219677 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219715 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219804 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b6d87e-0486-4c0c-9578-514626ca7579-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219836 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219859 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.219910 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2zgm\" (UniqueName: \"kubernetes.io/projected/e5b6d87e-0486-4c0c-9578-514626ca7579-kube-api-access-l2zgm\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.220315 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5b6d87e-0486-4c0c-9578-514626ca7579-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 
20:21:50.221611 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.221803 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.224497 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.226962 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b6d87e-0486-4c0c-9578-514626ca7579-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.229132 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5b6d87e-0486-4c0c-9578-514626ca7579-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.244194 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5b6d87e-0486-4c0c-9578-514626ca7579-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.246666 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2zgm\" (UniqueName: \"kubernetes.io/projected/e5b6d87e-0486-4c0c-9578-514626ca7579-kube-api-access-l2zgm\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.255112 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e5b6d87e-0486-4c0c-9578-514626ca7579\") " pod="openstack/openstack-galera-0" Mar 19 20:21:50 crc kubenswrapper[4799]: I0319 20:21:50.268338 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.380488 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.382623 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.386768 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.387046 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-f9t87" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.387433 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.387632 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.387738 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.539255 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.540926 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5mv\" (UniqueName: \"kubernetes.io/projected/9ad66907-e766-4e25-9e0c-03e2a0a803e6-kube-api-access-mn5mv\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.540961 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ad66907-e766-4e25-9e0c-03e2a0a803e6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.540987 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.541011 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.541028 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ad66907-e766-4e25-9e0c-03e2a0a803e6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.541616 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ad66907-e766-4e25-9e0c-03e2a0a803e6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.541648 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.630449 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.633747 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.640146 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643094 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643210 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5mv\" (UniqueName: \"kubernetes.io/projected/9ad66907-e766-4e25-9e0c-03e2a0a803e6-kube-api-access-mn5mv\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643248 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad66907-e766-4e25-9e0c-03e2a0a803e6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643302 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") 
" pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643334 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643354 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ad66907-e766-4e25-9e0c-03e2a0a803e6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643382 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ad66907-e766-4e25-9e0c-03e2a0a803e6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643437 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.643491 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 
20:21:51.643769 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.645053 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.645315 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ad66907-e766-4e25-9e0c-03e2a0a803e6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.647890 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.648528 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4c5rc" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.646882 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ad66907-e766-4e25-9e0c-03e2a0a803e6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.650366 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ad66907-e766-4e25-9e0c-03e2a0a803e6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.654806 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad66907-e766-4e25-9e0c-03e2a0a803e6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.672907 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.682187 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5mv\" (UniqueName: \"kubernetes.io/projected/9ad66907-e766-4e25-9e0c-03e2a0a803e6-kube-api-access-mn5mv\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.683372 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ad66907-e766-4e25-9e0c-03e2a0a803e6\") " pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.714123 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.745082 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d881c32e-3c0c-415a-aa56-6e70a316b015-config-data\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.745147 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d881c32e-3c0c-415a-aa56-6e70a316b015-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.745171 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d881c32e-3c0c-415a-aa56-6e70a316b015-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.745187 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d881c32e-3c0c-415a-aa56-6e70a316b015-kolla-config\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.745210 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khsxr\" (UniqueName: \"kubernetes.io/projected/d881c32e-3c0c-415a-aa56-6e70a316b015-kube-api-access-khsxr\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 
20:21:51.846420 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d881c32e-3c0c-415a-aa56-6e70a316b015-config-data\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.846504 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d881c32e-3c0c-415a-aa56-6e70a316b015-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.846527 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d881c32e-3c0c-415a-aa56-6e70a316b015-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.846541 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d881c32e-3c0c-415a-aa56-6e70a316b015-kolla-config\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.846567 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khsxr\" (UniqueName: \"kubernetes.io/projected/d881c32e-3c0c-415a-aa56-6e70a316b015-kube-api-access-khsxr\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.847913 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d881c32e-3c0c-415a-aa56-6e70a316b015-config-data\") pod 
\"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.849420 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d881c32e-3c0c-415a-aa56-6e70a316b015-kolla-config\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.867081 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d881c32e-3c0c-415a-aa56-6e70a316b015-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.869912 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d881c32e-3c0c-415a-aa56-6e70a316b015-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:51 crc kubenswrapper[4799]: I0319 20:21:51.897977 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khsxr\" (UniqueName: \"kubernetes.io/projected/d881c32e-3c0c-415a-aa56-6e70a316b015-kube-api-access-khsxr\") pod \"memcached-0\" (UID: \"d881c32e-3c0c-415a-aa56-6e70a316b015\") " pod="openstack/memcached-0" Mar 19 20:21:52 crc kubenswrapper[4799]: I0319 20:21:52.019586 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 20:21:53 crc kubenswrapper[4799]: I0319 20:21:53.721499 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:21:53 crc kubenswrapper[4799]: I0319 20:21:53.722892 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 20:21:53 crc kubenswrapper[4799]: I0319 20:21:53.729189 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gfnsr" Mar 19 20:21:53 crc kubenswrapper[4799]: I0319 20:21:53.758491 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:21:53 crc kubenswrapper[4799]: I0319 20:21:53.877244 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffhh\" (UniqueName: \"kubernetes.io/projected/1392f44b-030e-4305-aa1e-14d89e1696db-kube-api-access-kffhh\") pod \"kube-state-metrics-0\" (UID: \"1392f44b-030e-4305-aa1e-14d89e1696db\") " pod="openstack/kube-state-metrics-0" Mar 19 20:21:53 crc kubenswrapper[4799]: I0319 20:21:53.979835 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kffhh\" (UniqueName: \"kubernetes.io/projected/1392f44b-030e-4305-aa1e-14d89e1696db-kube-api-access-kffhh\") pod \"kube-state-metrics-0\" (UID: \"1392f44b-030e-4305-aa1e-14d89e1696db\") " pod="openstack/kube-state-metrics-0" Mar 19 20:21:54 crc kubenswrapper[4799]: I0319 20:21:54.001282 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffhh\" (UniqueName: \"kubernetes.io/projected/1392f44b-030e-4305-aa1e-14d89e1696db-kube-api-access-kffhh\") pod \"kube-state-metrics-0\" (UID: \"1392f44b-030e-4305-aa1e-14d89e1696db\") " pod="openstack/kube-state-metrics-0" Mar 19 20:21:54 crc kubenswrapper[4799]: I0319 20:21:54.056071 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.931634 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.933584 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.936754 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.937120 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.936926 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lc7rb" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.937257 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.937484 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 20:21:57 crc kubenswrapper[4799]: I0319 20:21:57.972074 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043335 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043402 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043445 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043482 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043580 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmfnh\" (UniqueName: \"kubernetes.io/projected/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-kube-api-access-gmfnh\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043627 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043848 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.043975 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145537 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-config\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145595 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145657 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145684 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145717 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145749 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145789 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmfnh\" (UniqueName: \"kubernetes.io/projected/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-kube-api-access-gmfnh\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.145835 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.146261 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.152120 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.153615 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-config\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.156508 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.156759 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.157309 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.160299 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc 
kubenswrapper[4799]: I0319 20:21:58.175605 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmfnh\" (UniqueName: \"kubernetes.io/projected/379ddd3c-6b22-4ccd-90a0-8c1cce4a572b-kube-api-access-gmfnh\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.182814 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b\") " pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.262104 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.318200 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-h89k2"] Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.319358 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.321212 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.321431 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-k2v88" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.323753 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.328979 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dcj2n"] Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.331362 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.337412 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h89k2"] Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.343359 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dcj2n"] Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.449814 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-ovn-controller-tls-certs\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.449871 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/279eab1e-b756-4a2c-be19-2b16d87d645c-scripts\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.449912 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-etc-ovs\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.449947 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-combined-ca-bundle\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc 
kubenswrapper[4799]: I0319 20:21:58.450055 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trkb8\" (UniqueName: \"kubernetes.io/projected/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-kube-api-access-trkb8\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450119 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-lib\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450173 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-scripts\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450195 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-run\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450245 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-run\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450317 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-log\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450408 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvss\" (UniqueName: \"kubernetes.io/projected/279eab1e-b756-4a2c-be19-2b16d87d645c-kube-api-access-dfvss\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450476 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-run-ovn\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.450619 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-log-ovn\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552532 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-log-ovn\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552600 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-ovn-controller-tls-certs\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552628 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/279eab1e-b756-4a2c-be19-2b16d87d645c-scripts\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552655 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-etc-ovs\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552676 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-combined-ca-bundle\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552694 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trkb8\" (UniqueName: \"kubernetes.io/projected/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-kube-api-access-trkb8\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552715 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-lib\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552740 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-scripts\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552755 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-run\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552775 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-run\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552798 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-log\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552834 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvss\" (UniqueName: \"kubernetes.io/projected/279eab1e-b756-4a2c-be19-2b16d87d645c-kube-api-access-dfvss\") pod \"ovn-controller-ovs-dcj2n\" (UID: 
\"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.552861 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-run-ovn\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.553209 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-log-ovn\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.553253 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-run-ovn\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.553438 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-lib\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.553802 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-etc-ovs\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.553990 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-var-run\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.554012 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-run\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.554406 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/279eab1e-b756-4a2c-be19-2b16d87d645c-var-log\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.555879 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-scripts\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.555920 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/279eab1e-b756-4a2c-be19-2b16d87d645c-scripts\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.558736 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-combined-ca-bundle\") pod \"ovn-controller-h89k2\" (UID: 
\"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.563059 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-ovn-controller-tls-certs\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.577759 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvss\" (UniqueName: \"kubernetes.io/projected/279eab1e-b756-4a2c-be19-2b16d87d645c-kube-api-access-dfvss\") pod \"ovn-controller-ovs-dcj2n\" (UID: \"279eab1e-b756-4a2c-be19-2b16d87d645c\") " pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.589239 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trkb8\" (UniqueName: \"kubernetes.io/projected/cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8-kube-api-access-trkb8\") pod \"ovn-controller-h89k2\" (UID: \"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8\") " pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.634211 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h89k2" Mar 19 20:21:58 crc kubenswrapper[4799]: I0319 20:21:58.647983 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.133484 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565862-hvpbt"] Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.134942 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.137941 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.138179 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.138254 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.163497 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-hvpbt"] Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.180153 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55qk\" (UniqueName: \"kubernetes.io/projected/ed0e3ea0-f539-4049-ab71-127f44c0d997-kube-api-access-k55qk\") pod \"auto-csr-approver-29565862-hvpbt\" (UID: \"ed0e3ea0-f539-4049-ab71-127f44c0d997\") " pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.283041 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55qk\" (UniqueName: \"kubernetes.io/projected/ed0e3ea0-f539-4049-ab71-127f44c0d997-kube-api-access-k55qk\") pod \"auto-csr-approver-29565862-hvpbt\" (UID: \"ed0e3ea0-f539-4049-ab71-127f44c0d997\") " pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.311933 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55qk\" (UniqueName: \"kubernetes.io/projected/ed0e3ea0-f539-4049-ab71-127f44c0d997-kube-api-access-k55qk\") pod \"auto-csr-approver-29565862-hvpbt\" (UID: \"ed0e3ea0-f539-4049-ab71-127f44c0d997\") " 
pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.460462 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.724668 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.726736 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.728570 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.729603 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.732782 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.736751 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-r544f" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.746454 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792420 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3d93760-46c9-46e0-aff9-38ad08bad16b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792490 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792553 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d93760-46c9-46e0-aff9-38ad08bad16b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792609 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4kg\" (UniqueName: \"kubernetes.io/projected/a3d93760-46c9-46e0-aff9-38ad08bad16b-kube-api-access-qr4kg\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792693 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792739 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.792786 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.793004 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d93760-46c9-46e0-aff9-38ad08bad16b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.895364 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d93760-46c9-46e0-aff9-38ad08bad16b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.895657 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3d93760-46c9-46e0-aff9-38ad08bad16b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.896349 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3d93760-46c9-46e0-aff9-38ad08bad16b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.896872 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") device mount path \"/mnt/openstack/pv11\"" 
pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.896980 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d93760-46c9-46e0-aff9-38ad08bad16b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.905258 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.905429 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d93760-46c9-46e0-aff9-38ad08bad16b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.905529 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4kg\" (UniqueName: \"kubernetes.io/projected/a3d93760-46c9-46e0-aff9-38ad08bad16b-kube-api-access-qr4kg\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.905623 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.905671 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.905724 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.907249 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3d93760-46c9-46e0-aff9-38ad08bad16b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.913445 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.913981 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.915274 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d93760-46c9-46e0-aff9-38ad08bad16b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.941831 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:00 crc kubenswrapper[4799]: I0319 20:22:00.947040 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4kg\" (UniqueName: \"kubernetes.io/projected/a3d93760-46c9-46e0-aff9-38ad08bad16b-kube-api-access-qr4kg\") pod \"ovsdbserver-nb-0\" (UID: \"a3d93760-46c9-46e0-aff9-38ad08bad16b\") " pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:01 crc kubenswrapper[4799]: I0319 20:22:01.056835 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.181594 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.182150 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s8zpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-sff4q_openstack(2d1cb6bd-c00e-4cb9-990d-809e7aeaa217): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.183441 4799 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-sff4q" podUID="2d1cb6bd-c00e-4cb9-990d-809e7aeaa217" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.298881 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.299016 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sl6dd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-26g59_openstack(c7224368-9599-456c-a0f4-002960e5f18c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.299035 4799 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.299108 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwxbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunA
sGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-9sjt5_openstack(d58ed172-1335-4a56-818e-9dcec53add01): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.300310 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" podUID="c7224368-9599-456c-a0f4-002960e5f18c" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.300502 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" podUID="d58ed172-1335-4a56-818e-9dcec53add01" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.316443 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.316583 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-728kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-zpfvp_openstack(ca0f5c48-f13c-4bc5-a79e-137f7a039e8c): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.317871 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.484089 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" Mar 19 20:22:07 crc kubenswrapper[4799]: E0319 20:22:07.489090 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" podUID="c7224368-9599-456c-a0f4-002960e5f18c" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.244620 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.284275 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.296625 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.297338 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-sff4q" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.319751 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.329218 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.336066 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-hvpbt"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.341664 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h89k2"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.349502 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.365229 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: W0319 20:22:08.375775 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded0e3ea0_f539_4049_ab71_127f44c0d997.slice/crio-571a132d20136ea07dd6244b8fb28ac221a3dc08fa8d4f25378c3da6122c4332 WatchSource:0}: Error finding container 571a132d20136ea07dd6244b8fb28ac221a3dc08fa8d4f25378c3da6122c4332: Status 404 returned error can't find the container with id 571a132d20136ea07dd6244b8fb28ac221a3dc08fa8d4f25378c3da6122c4332 Mar 19 20:22:08 crc kubenswrapper[4799]: W0319 20:22:08.379326 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5b6d87e_0486_4c0c_9578_514626ca7579.slice/crio-a7978625c64d69925067dabe81d52d7169322fde15fd42ec2341592f20e0865e WatchSource:0}: Error finding container a7978625c64d69925067dabe81d52d7169322fde15fd42ec2341592f20e0865e: Status 404 returned error can't 
find the container with id a7978625c64d69925067dabe81d52d7169322fde15fd42ec2341592f20e0865e Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.398915 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-dns-svc\") pod \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.399022 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-config\") pod \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.399072 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ed172-1335-4a56-818e-9dcec53add01-config\") pod \"d58ed172-1335-4a56-818e-9dcec53add01\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.399098 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8zpx\" (UniqueName: \"kubernetes.io/projected/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-kube-api-access-s8zpx\") pod \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\" (UID: \"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217\") " Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.399138 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwxbx\" (UniqueName: \"kubernetes.io/projected/d58ed172-1335-4a56-818e-9dcec53add01-kube-api-access-nwxbx\") pod \"d58ed172-1335-4a56-818e-9dcec53add01\" (UID: \"d58ed172-1335-4a56-818e-9dcec53add01\") " Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.400648 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d1cb6bd-c00e-4cb9-990d-809e7aeaa217" (UID: "2d1cb6bd-c00e-4cb9-990d-809e7aeaa217"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.400752 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-config" (OuterVolumeSpecName: "config") pod "2d1cb6bd-c00e-4cb9-990d-809e7aeaa217" (UID: "2d1cb6bd-c00e-4cb9-990d-809e7aeaa217"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.400985 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58ed172-1335-4a56-818e-9dcec53add01-config" (OuterVolumeSpecName: "config") pod "d58ed172-1335-4a56-818e-9dcec53add01" (UID: "d58ed172-1335-4a56-818e-9dcec53add01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.404902 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58ed172-1335-4a56-818e-9dcec53add01-kube-api-access-nwxbx" (OuterVolumeSpecName: "kube-api-access-nwxbx") pod "d58ed172-1335-4a56-818e-9dcec53add01" (UID: "d58ed172-1335-4a56-818e-9dcec53add01"). InnerVolumeSpecName "kube-api-access-nwxbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.405422 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-kube-api-access-s8zpx" (OuterVolumeSpecName: "kube-api-access-s8zpx") pod "2d1cb6bd-c00e-4cb9-990d-809e7aeaa217" (UID: "2d1cb6bd-c00e-4cb9-990d-809e7aeaa217"). InnerVolumeSpecName "kube-api-access-s8zpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.413255 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.485294 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ee15a17-4d32-468e-8a57-2a597cebd850","Type":"ContainerStarted","Data":"3cbc7a6aeda24d6a8b2a9246dcce136892709025fa19de5ce5622db489182542"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.486409 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b","Type":"ContainerStarted","Data":"f80a4adfaea9510a69db5fcfe22d0cb31b790ff822bff6084359bc7ce322ed88"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.487451 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d881c32e-3c0c-415a-aa56-6e70a316b015","Type":"ContainerStarted","Data":"1209a1ff29e103ff9d211c639ed315100ecb38d369a0e6d8cc0fdddeebb88c2d"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.489029 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-sff4q" event={"ID":"2d1cb6bd-c00e-4cb9-990d-809e7aeaa217","Type":"ContainerDied","Data":"683773c1a8a1059b32c09c14b58631af122a837e089260b501a1096c17f06787"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.489048 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-sff4q" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.499683 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"749a043f-5262-416d-b639-9ff8fdcf7f12","Type":"ContainerStarted","Data":"e874606cd843e1968c2986c7c98cecacf31032b9c2b1a34185f32ca0cfb6edf7"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.500849 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.500865 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ed172-1335-4a56-818e-9dcec53add01-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.500883 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8zpx\" (UniqueName: \"kubernetes.io/projected/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-kube-api-access-s8zpx\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.500894 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwxbx\" (UniqueName: \"kubernetes.io/projected/d58ed172-1335-4a56-818e-9dcec53add01-kube-api-access-nwxbx\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.500903 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.510661 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2" 
event={"ID":"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8","Type":"ContainerStarted","Data":"69c69f2ec506b2883a2a07f5b20887bcdcc987783246e7e383722b1344c91c83"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.514725 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5b6d87e-0486-4c0c-9578-514626ca7579","Type":"ContainerStarted","Data":"a7978625c64d69925067dabe81d52d7169322fde15fd42ec2341592f20e0865e"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.516373 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ad66907-e766-4e25-9e0c-03e2a0a803e6","Type":"ContainerStarted","Data":"55148d2e5af84f4fed4e182fd624d77828095e8f399dbf01e7e2e425c6283676"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.517796 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1392f44b-030e-4305-aa1e-14d89e1696db","Type":"ContainerStarted","Data":"61a4a40c5f8af6942fd47a7520e6d578dc12eab153dea85830188e0ea496748b"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.518775 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" event={"ID":"ed0e3ea0-f539-4049-ab71-127f44c0d997","Type":"ContainerStarted","Data":"571a132d20136ea07dd6244b8fb28ac221a3dc08fa8d4f25378c3da6122c4332"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.520084 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" event={"ID":"d58ed172-1335-4a56-818e-9dcec53add01","Type":"ContainerDied","Data":"c428d800c5bc6e4a08b9f209b340962132d68b63a3be0219364bd1d98b15288a"} Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.520160 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-9sjt5" Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.595986 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-9sjt5"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.610139 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-9sjt5"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.620352 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-sff4q"] Mar 19 20:22:08 crc kubenswrapper[4799]: I0319 20:22:08.632256 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-sff4q"] Mar 19 20:22:09 crc kubenswrapper[4799]: I0319 20:22:09.043329 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 20:22:09 crc kubenswrapper[4799]: I0319 20:22:09.141167 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1cb6bd-c00e-4cb9-990d-809e7aeaa217" path="/var/lib/kubelet/pods/2d1cb6bd-c00e-4cb9-990d-809e7aeaa217/volumes" Mar 19 20:22:09 crc kubenswrapper[4799]: I0319 20:22:09.141699 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58ed172-1335-4a56-818e-9dcec53add01" path="/var/lib/kubelet/pods/d58ed172-1335-4a56-818e-9dcec53add01/volumes" Mar 19 20:22:09 crc kubenswrapper[4799]: I0319 20:22:09.528121 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a3d93760-46c9-46e0-aff9-38ad08bad16b","Type":"ContainerStarted","Data":"638bf68b539960f122b55d7a8aa0723dfaf75bc4a8d79fc26845af38734ab411"} Mar 19 20:22:10 crc kubenswrapper[4799]: I0319 20:22:10.104967 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dcj2n"] Mar 19 20:22:11 crc kubenswrapper[4799]: I0319 20:22:11.552600 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dcj2n" 
event={"ID":"279eab1e-b756-4a2c-be19-2b16d87d645c","Type":"ContainerStarted","Data":"e4ccaba84d941ac8e4a01e1da295efd1565d349dc6be4e17bff1cb2c87d664fa"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.610049 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dcj2n" event={"ID":"279eab1e-b756-4a2c-be19-2b16d87d645c","Type":"ContainerStarted","Data":"4a6a287ba6938794d407d714e0b7baf41cd55a17d7df8b6c27f9778174be85d2"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.615297 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a3d93760-46c9-46e0-aff9-38ad08bad16b","Type":"ContainerStarted","Data":"ca066163c0c9432bb5edd114c8bda0138b2f75a889d9f19576de4efcbb17c14d"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.616455 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b","Type":"ContainerStarted","Data":"593f1734bb5b305ce82591e82abb6ffbb1477e093086346b06581bdcc863124e"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.617625 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5b6d87e-0486-4c0c-9578-514626ca7579","Type":"ContainerStarted","Data":"087567bf74b509501624ba4ac56dba2dcd0c967f4914d8c67cf721b6c2b9762d"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.622432 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ad66907-e766-4e25-9e0c-03e2a0a803e6","Type":"ContainerStarted","Data":"3f9ad4f9a4ac6388cec4eec34c9db84239dc86d30512ee6391c8ec293b7a4529"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.624344 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1392f44b-030e-4305-aa1e-14d89e1696db","Type":"ContainerStarted","Data":"8c00e5f00da3f3438e958825417727801c2e375dca1b9bbb65ae4b86343917f7"} Mar 19 
20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.624640 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.626315 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d881c32e-3c0c-415a-aa56-6e70a316b015","Type":"ContainerStarted","Data":"1bb4534919bd4e2c2192775011352806103b68650cbb274bf657845b645ae225"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.627116 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.630749 4799 generic.go:334] "Generic (PLEG): container finished" podID="ed0e3ea0-f539-4049-ab71-127f44c0d997" containerID="ef0e2f99cd80bd9cf55165de8414ab0f5e5d043313cad0c04973c3aa57a69653" exitCode=0 Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.630787 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" event={"ID":"ed0e3ea0-f539-4049-ab71-127f44c0d997","Type":"ContainerDied","Data":"ef0e2f99cd80bd9cf55165de8414ab0f5e5d043313cad0c04973c3aa57a69653"} Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.664308 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.186350177 podStartE2EDuration="24.664290622s" podCreationTimestamp="2026-03-19 20:21:53 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.382063517 +0000 UTC m=+1005.988016589" lastFinishedPulling="2026-03-19 20:22:16.860003922 +0000 UTC m=+1014.465957034" observedRunningTime="2026-03-19 20:22:17.656458757 +0000 UTC m=+1015.262411839" watchObservedRunningTime="2026-03-19 20:22:17.664290622 +0000 UTC m=+1015.270243694" Mar 19 20:22:17 crc kubenswrapper[4799]: I0319 20:22:17.717730 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" 
podStartSLOduration=18.345823014 podStartE2EDuration="26.717706418s" podCreationTimestamp="2026-03-19 20:21:51 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.356658631 +0000 UTC m=+1005.962611703" lastFinishedPulling="2026-03-19 20:22:16.728541995 +0000 UTC m=+1014.334495107" observedRunningTime="2026-03-19 20:22:17.716099784 +0000 UTC m=+1015.322052866" watchObservedRunningTime="2026-03-19 20:22:17.717706418 +0000 UTC m=+1015.323659490" Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.643482 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2" event={"ID":"cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8","Type":"ContainerStarted","Data":"e2d155eb7abf8170d0af6176aa33978d8f799305d5b1cc5000d6f0831978f5d6"} Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.643875 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-h89k2" Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.645354 4799 generic.go:334] "Generic (PLEG): container finished" podID="279eab1e-b756-4a2c-be19-2b16d87d645c" containerID="4a6a287ba6938794d407d714e0b7baf41cd55a17d7df8b6c27f9778174be85d2" exitCode=0 Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.645575 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dcj2n" event={"ID":"279eab1e-b756-4a2c-be19-2b16d87d645c","Type":"ContainerDied","Data":"4a6a287ba6938794d407d714e0b7baf41cd55a17d7df8b6c27f9778174be85d2"} Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.647502 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"749a043f-5262-416d-b639-9ff8fdcf7f12","Type":"ContainerStarted","Data":"d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074"} Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.649564 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2ee15a17-4d32-468e-8a57-2a597cebd850","Type":"ContainerStarted","Data":"37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e"} Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.677648 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-h89k2" podStartSLOduration=12.1806007 podStartE2EDuration="20.677631118s" podCreationTimestamp="2026-03-19 20:21:58 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.333624879 +0000 UTC m=+1005.939577951" lastFinishedPulling="2026-03-19 20:22:16.830655287 +0000 UTC m=+1014.436608369" observedRunningTime="2026-03-19 20:22:18.665365331 +0000 UTC m=+1016.271318413" watchObservedRunningTime="2026-03-19 20:22:18.677631118 +0000 UTC m=+1016.283584190" Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.962886 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.969474 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k55qk\" (UniqueName: \"kubernetes.io/projected/ed0e3ea0-f539-4049-ab71-127f44c0d997-kube-api-access-k55qk\") pod \"ed0e3ea0-f539-4049-ab71-127f44c0d997\" (UID: \"ed0e3ea0-f539-4049-ab71-127f44c0d997\") " Mar 19 20:22:18 crc kubenswrapper[4799]: I0319 20:22:18.974276 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0e3ea0-f539-4049-ab71-127f44c0d997-kube-api-access-k55qk" (OuterVolumeSpecName: "kube-api-access-k55qk") pod "ed0e3ea0-f539-4049-ab71-127f44c0d997" (UID: "ed0e3ea0-f539-4049-ab71-127f44c0d997"). InnerVolumeSpecName "kube-api-access-k55qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.071908 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k55qk\" (UniqueName: \"kubernetes.io/projected/ed0e3ea0-f539-4049-ab71-127f44c0d997-kube-api-access-k55qk\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.659788 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" event={"ID":"ed0e3ea0-f539-4049-ab71-127f44c0d997","Type":"ContainerDied","Data":"571a132d20136ea07dd6244b8fb28ac221a3dc08fa8d4f25378c3da6122c4332"} Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.660012 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="571a132d20136ea07dd6244b8fb28ac221a3dc08fa8d4f25378c3da6122c4332" Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.660061 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-hvpbt" Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.667326 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dcj2n" event={"ID":"279eab1e-b756-4a2c-be19-2b16d87d645c","Type":"ContainerStarted","Data":"b2f62a236b0d4f0e6aca1fa9b48176809fcb991e18650d3d9fbab63af7560254"} Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.667450 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dcj2n" event={"ID":"279eab1e-b756-4a2c-be19-2b16d87d645c","Type":"ContainerStarted","Data":"4c20e199f70031872002fe396bf5cb3160dbd65485484335017db8f3076b09b0"} Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.667487 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.667511 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:22:19 crc kubenswrapper[4799]: I0319 20:22:19.994655 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dcj2n" podStartSLOduration=16.141897927 podStartE2EDuration="21.994633786s" podCreationTimestamp="2026-03-19 20:21:58 +0000 UTC" firstStartedPulling="2026-03-19 20:22:10.956363506 +0000 UTC m=+1008.562316578" lastFinishedPulling="2026-03-19 20:22:16.809099325 +0000 UTC m=+1014.415052437" observedRunningTime="2026-03-19 20:22:19.692449074 +0000 UTC m=+1017.298402146" watchObservedRunningTime="2026-03-19 20:22:19.994633786 +0000 UTC m=+1017.600586868" Mar 19 20:22:20 crc kubenswrapper[4799]: I0319 20:22:20.031854 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-5p4qg"] Mar 19 20:22:20 crc kubenswrapper[4799]: I0319 20:22:20.036644 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-5p4qg"] Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.155529 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9f86d5-d034-4353-bdb0-67c42ee7d2e0" path="/var/lib/kubelet/pods/cc9f86d5-d034-4353-bdb0-67c42ee7d2e0/volumes" Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.690036 4799 generic.go:334] "Generic (PLEG): container finished" podID="e5b6d87e-0486-4c0c-9578-514626ca7579" containerID="087567bf74b509501624ba4ac56dba2dcd0c967f4914d8c67cf721b6c2b9762d" exitCode=0 Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.690122 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5b6d87e-0486-4c0c-9578-514626ca7579","Type":"ContainerDied","Data":"087567bf74b509501624ba4ac56dba2dcd0c967f4914d8c67cf721b6c2b9762d"} Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.694256 4799 generic.go:334] "Generic (PLEG): container finished" podID="9ad66907-e766-4e25-9e0c-03e2a0a803e6" 
containerID="3f9ad4f9a4ac6388cec4eec34c9db84239dc86d30512ee6391c8ec293b7a4529" exitCode=0 Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.694342 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ad66907-e766-4e25-9e0c-03e2a0a803e6","Type":"ContainerDied","Data":"3f9ad4f9a4ac6388cec4eec34c9db84239dc86d30512ee6391c8ec293b7a4529"} Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.697429 4799 generic.go:334] "Generic (PLEG): container finished" podID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerID="d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee" exitCode=0 Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.697526 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" event={"ID":"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c","Type":"ContainerDied","Data":"d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee"} Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.699661 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a3d93760-46c9-46e0-aff9-38ad08bad16b","Type":"ContainerStarted","Data":"526bc927edd6a72a45fab5589beaa804ca6cbfff7a08896e4a2ada3a8a7f62eb"} Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.707282 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"379ddd3c-6b22-4ccd-90a0-8c1cce4a572b","Type":"ContainerStarted","Data":"186e86430d43dcf882aac23871d5824d0f78908ddd4a77491cb3c45ffc097183"} Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.743084 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.181913775 podStartE2EDuration="25.743064023s" podCreationTimestamp="2026-03-19 20:21:56 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.419672799 +0000 UTC m=+1006.025625871" lastFinishedPulling="2026-03-19 20:22:20.980823047 +0000 
UTC m=+1018.586776119" observedRunningTime="2026-03-19 20:22:21.741432908 +0000 UTC m=+1019.347385990" watchObservedRunningTime="2026-03-19 20:22:21.743064023 +0000 UTC m=+1019.349017105" Mar 19 20:22:21 crc kubenswrapper[4799]: I0319 20:22:21.830482 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.917105679 podStartE2EDuration="22.830465291s" podCreationTimestamp="2026-03-19 20:21:59 +0000 UTC" firstStartedPulling="2026-03-19 20:22:09.0548975 +0000 UTC m=+1006.660850582" lastFinishedPulling="2026-03-19 20:22:20.968257122 +0000 UTC m=+1018.574210194" observedRunningTime="2026-03-19 20:22:21.818866823 +0000 UTC m=+1019.424819895" watchObservedRunningTime="2026-03-19 20:22:21.830465291 +0000 UTC m=+1019.436418363" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.023037 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.057834 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.096300 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.263326 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.294492 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.716816 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" event={"ID":"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c","Type":"ContainerStarted","Data":"abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db"} Mar 19 20:22:22 crc 
kubenswrapper[4799]: I0319 20:22:22.717126 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.719589 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5b6d87e-0486-4c0c-9578-514626ca7579","Type":"ContainerStarted","Data":"0796eb1d47b50a5a986bc191e4c524a9eec60924b728d243a0af44174b851366"} Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.722461 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ad66907-e766-4e25-9e0c-03e2a0a803e6","Type":"ContainerStarted","Data":"12b5877df27652ab1a5b625ca2d3f3d88eee82b4360b21ac3ad37446fdcd8cfd"} Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.725761 4799 generic.go:334] "Generic (PLEG): container finished" podID="c7224368-9599-456c-a0f4-002960e5f18c" containerID="d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3" exitCode=0 Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.725847 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" event={"ID":"c7224368-9599-456c-a0f4-002960e5f18c","Type":"ContainerDied","Data":"d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3"} Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.726505 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.726539 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.741220 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" podStartSLOduration=2.971059453 podStartE2EDuration="35.741197761s" podCreationTimestamp="2026-03-19 20:21:47 +0000 UTC" 
firstStartedPulling="2026-03-19 20:21:48.299529017 +0000 UTC m=+985.905482089" lastFinishedPulling="2026-03-19 20:22:21.069667325 +0000 UTC m=+1018.675620397" observedRunningTime="2026-03-19 20:22:22.734159428 +0000 UTC m=+1020.340112550" watchObservedRunningTime="2026-03-19 20:22:22.741197761 +0000 UTC m=+1020.347150853" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.762727 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.214111288 podStartE2EDuration="32.762706361s" podCreationTimestamp="2026-03-19 20:21:50 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.322660708 +0000 UTC m=+1005.928613780" lastFinishedPulling="2026-03-19 20:22:16.871255781 +0000 UTC m=+1014.477208853" observedRunningTime="2026-03-19 20:22:22.76194471 +0000 UTC m=+1020.367897782" watchObservedRunningTime="2026-03-19 20:22:22.762706361 +0000 UTC m=+1020.368659453" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.782937 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.794734 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 20:22:22 crc kubenswrapper[4799]: I0319 20:22:22.820971 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.376276348 podStartE2EDuration="34.820950399s" podCreationTimestamp="2026-03-19 20:21:48 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.385645236 +0000 UTC m=+1005.991598308" lastFinishedPulling="2026-03-19 20:22:16.830319257 +0000 UTC m=+1014.436272359" observedRunningTime="2026-03-19 20:22:22.816215349 +0000 UTC m=+1020.422168441" watchObservedRunningTime="2026-03-19 20:22:22.820950399 +0000 UTC m=+1020.426903471" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.075141 4799 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-26g59"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.099350 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-nz4f8"] Mar 19 20:22:23 crc kubenswrapper[4799]: E0319 20:22:23.099645 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0e3ea0-f539-4049-ab71-127f44c0d997" containerName="oc" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.099662 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0e3ea0-f539-4049-ab71-127f44c0d997" containerName="oc" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.099815 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0e3ea0-f539-4049-ab71-127f44c0d997" containerName="oc" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.100548 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.103894 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.142110 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-nz4f8"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.161916 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.162052 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-config\") pod 
\"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.162100 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.162175 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58xmp\" (UniqueName: \"kubernetes.io/projected/0054f536-14f7-4796-b4a0-bbf9cb1bebec-kube-api-access-58xmp\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.236357 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-q7s8k"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.238499 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.241797 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.242094 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q7s8k"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.266231 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.266341 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-config\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.266391 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.266447 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58xmp\" (UniqueName: \"kubernetes.io/projected/0054f536-14f7-4796-b4a0-bbf9cb1bebec-kube-api-access-58xmp\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 
20:22:23.269095 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-config\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.269106 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.269705 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.275668 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-zpfvp"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.292561 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58xmp\" (UniqueName: \"kubernetes.io/projected/0054f536-14f7-4796-b4a0-bbf9cb1bebec-kube-api-access-58xmp\") pod \"dnsmasq-dns-84d7bcdf99-nz4f8\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.305409 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-tjh5n"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.307934 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.315614 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.335302 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-tjh5n"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.353155 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.354432 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.359111 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.359261 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.359448 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.359548 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-28ztf" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369497 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5q8\" (UniqueName: \"kubernetes.io/projected/ada5da2f-892a-4d4b-a18c-d641456e9124-kube-api-access-bc5q8\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369569 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-dns-svc\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369611 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ada5da2f-892a-4d4b-a18c-d641456e9124-ovn-rundir\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369645 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgxjv\" (UniqueName: \"kubernetes.io/projected/0d76bbee-3d03-4df7-bdea-8aa71418225f-kube-api-access-fgxjv\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369670 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-config\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369692 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ada5da2f-892a-4d4b-a18c-d641456e9124-ovs-rundir\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369728 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369758 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369795 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada5da2f-892a-4d4b-a18c-d641456e9124-combined-ca-bundle\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369821 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada5da2f-892a-4d4b-a18c-d641456e9124-config\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.369842 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada5da2f-892a-4d4b-a18c-d641456e9124-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.370919 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-northd-0"] Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.417182 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471507 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5q8\" (UniqueName: \"kubernetes.io/projected/ada5da2f-892a-4d4b-a18c-d641456e9124-kube-api-access-bc5q8\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471549 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471567 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/df45d4ef-7350-42d7-a2d7-cede9b13ff55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471583 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45d4ef-7350-42d7-a2d7-cede9b13ff55-config\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471625 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-dns-svc\") pod 
\"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471647 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ada5da2f-892a-4d4b-a18c-d641456e9124-ovn-rundir\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471667 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgxjv\" (UniqueName: \"kubernetes.io/projected/0d76bbee-3d03-4df7-bdea-8aa71418225f-kube-api-access-fgxjv\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471693 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-config\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471716 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ada5da2f-892a-4d4b-a18c-d641456e9124-ovs-rundir\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471733 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df45d4ef-7350-42d7-a2d7-cede9b13ff55-scripts\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " 
pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471759 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471783 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471806 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471824 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada5da2f-892a-4d4b-a18c-d641456e9124-combined-ca-bundle\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471846 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada5da2f-892a-4d4b-a18c-d641456e9124-config\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: 
I0319 20:22:23.471867 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada5da2f-892a-4d4b-a18c-d641456e9124-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471886 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssw2q\" (UniqueName: \"kubernetes.io/projected/df45d4ef-7350-42d7-a2d7-cede9b13ff55-kube-api-access-ssw2q\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.471921 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.472681 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-config\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.472767 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.472899 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ada5da2f-892a-4d4b-a18c-d641456e9124-ovn-rundir\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.472941 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ada5da2f-892a-4d4b-a18c-d641456e9124-ovs-rundir\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.473106 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-dns-svc\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.473586 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.473688 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada5da2f-892a-4d4b-a18c-d641456e9124-config\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.477089 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ada5da2f-892a-4d4b-a18c-d641456e9124-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.487017 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada5da2f-892a-4d4b-a18c-d641456e9124-combined-ca-bundle\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.491136 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5q8\" (UniqueName: \"kubernetes.io/projected/ada5da2f-892a-4d4b-a18c-d641456e9124-kube-api-access-bc5q8\") pod \"ovn-controller-metrics-q7s8k\" (UID: \"ada5da2f-892a-4d4b-a18c-d641456e9124\") " pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.497327 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgxjv\" (UniqueName: \"kubernetes.io/projected/0d76bbee-3d03-4df7-bdea-8aa71418225f-kube-api-access-fgxjv\") pod \"dnsmasq-dns-f697c8bff-tjh5n\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.567211 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-q7s8k" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573774 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573832 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573851 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/df45d4ef-7350-42d7-a2d7-cede9b13ff55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573870 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45d4ef-7350-42d7-a2d7-cede9b13ff55-config\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573923 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df45d4ef-7350-42d7-a2d7-cede9b13ff55-scripts\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573955 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.573996 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssw2q\" (UniqueName: \"kubernetes.io/projected/df45d4ef-7350-42d7-a2d7-cede9b13ff55-kube-api-access-ssw2q\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.574372 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/df45d4ef-7350-42d7-a2d7-cede9b13ff55-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.575312 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df45d4ef-7350-42d7-a2d7-cede9b13ff55-scripts\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.575917 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45d4ef-7350-42d7-a2d7-cede9b13ff55-config\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.579236 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: 
I0319 20:22:23.584002 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.584245 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/df45d4ef-7350-42d7-a2d7-cede9b13ff55-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.591377 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssw2q\" (UniqueName: \"kubernetes.io/projected/df45d4ef-7350-42d7-a2d7-cede9b13ff55-kube-api-access-ssw2q\") pod \"ovn-northd-0\" (UID: \"df45d4ef-7350-42d7-a2d7-cede9b13ff55\") " pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.653806 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.672763 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.738133 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" event={"ID":"c7224368-9599-456c-a0f4-002960e5f18c","Type":"ContainerStarted","Data":"6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d"} Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.738324 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" podUID="c7224368-9599-456c-a0f4-002960e5f18c" containerName="dnsmasq-dns" containerID="cri-o://6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d" gracePeriod=10 Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.738475 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.773097 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" podStartSLOduration=-9223372000.0817 podStartE2EDuration="36.773077236s" podCreationTimestamp="2026-03-19 20:21:47 +0000 UTC" firstStartedPulling="2026-03-19 20:21:48.321931452 +0000 UTC m=+985.927884524" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:22:23.76922621 +0000 UTC m=+1021.375179282" watchObservedRunningTime="2026-03-19 20:22:23.773077236 +0000 UTC m=+1021.379030308" Mar 19 20:22:23 crc kubenswrapper[4799]: I0319 20:22:23.859640 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-nz4f8"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.052931 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q7s8k"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.066763 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.130560 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-nz4f8"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.206547 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-lvbzp"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.207854 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.213310 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-lvbzp"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.288972 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.289058 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-config\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.289134 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.289171 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqvxr\" (UniqueName: \"kubernetes.io/projected/fc766b6d-f977-4050-a719-5327a3a351e0-kube-api-access-fqvxr\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.289191 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.391259 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-config\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.391356 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.391400 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqvxr\" (UniqueName: \"kubernetes.io/projected/fc766b6d-f977-4050-a719-5327a3a351e0-kube-api-access-fqvxr\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.391419 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.391466 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.392196 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-config\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.392212 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.392715 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.393003 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.411726 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqvxr\" (UniqueName: \"kubernetes.io/projected/fc766b6d-f977-4050-a719-5327a3a351e0-kube-api-access-fqvxr\") pod \"dnsmasq-dns-b4ddd5fb7-lvbzp\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.417002 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.492267 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl6dd\" (UniqueName: \"kubernetes.io/projected/c7224368-9599-456c-a0f4-002960e5f18c-kube-api-access-sl6dd\") pod \"c7224368-9599-456c-a0f4-002960e5f18c\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.492347 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-dns-svc\") pod \"c7224368-9599-456c-a0f4-002960e5f18c\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.492392 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-config\") pod \"c7224368-9599-456c-a0f4-002960e5f18c\" (UID: \"c7224368-9599-456c-a0f4-002960e5f18c\") " Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.495846 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c7224368-9599-456c-a0f4-002960e5f18c-kube-api-access-sl6dd" (OuterVolumeSpecName: "kube-api-access-sl6dd") pod "c7224368-9599-456c-a0f4-002960e5f18c" (UID: "c7224368-9599-456c-a0f4-002960e5f18c"). InnerVolumeSpecName "kube-api-access-sl6dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.535705 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7224368-9599-456c-a0f4-002960e5f18c" (UID: "c7224368-9599-456c-a0f4-002960e5f18c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.541050 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-config" (OuterVolumeSpecName: "config") pod "c7224368-9599-456c-a0f4-002960e5f18c" (UID: "c7224368-9599-456c-a0f4-002960e5f18c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.544886 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.546299 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.581621 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-tjh5n"] Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.595029 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.595060 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl6dd\" (UniqueName: \"kubernetes.io/projected/c7224368-9599-456c-a0f4-002960e5f18c-kube-api-access-sl6dd\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.595072 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7224368-9599-456c-a0f4-002960e5f18c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.746449 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" event={"ID":"0d76bbee-3d03-4df7-bdea-8aa71418225f","Type":"ContainerStarted","Data":"3afb8abe50174751f83db0fd5f266758c510b692c3e3150f0ee36c42c0c3f39a"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.747909 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"df45d4ef-7350-42d7-a2d7-cede9b13ff55","Type":"ContainerStarted","Data":"19aa151b1dc702acc37945c89b083a19f7e5b4c67ff22d21572d4c0592ce6ba3"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.749401 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q7s8k" 
event={"ID":"ada5da2f-892a-4d4b-a18c-d641456e9124","Type":"ContainerStarted","Data":"5e3becd250e5c9ac94ce570f873a18eac23fe632704d7d7cba706aa975263dab"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.749421 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q7s8k" event={"ID":"ada5da2f-892a-4d4b-a18c-d641456e9124","Type":"ContainerStarted","Data":"7071fa24917640b9a379d67604eabdd86521bb4d6a626bfd233a999f0566e1af"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.752597 4799 generic.go:334] "Generic (PLEG): container finished" podID="0054f536-14f7-4796-b4a0-bbf9cb1bebec" containerID="83821e04a726bc98862e3b0e6e7a3f64f8d4aa2037764a8132ad5eb0f4c7ec80" exitCode=0 Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.752639 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" event={"ID":"0054f536-14f7-4796-b4a0-bbf9cb1bebec","Type":"ContainerDied","Data":"83821e04a726bc98862e3b0e6e7a3f64f8d4aa2037764a8132ad5eb0f4c7ec80"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.752655 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" event={"ID":"0054f536-14f7-4796-b4a0-bbf9cb1bebec","Type":"ContainerStarted","Data":"320fd9a7a340b73cf34e0035e425fce4e2cb3dd1b3a9b2df2fe982de81a37ff4"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.754894 4799 generic.go:334] "Generic (PLEG): container finished" podID="c7224368-9599-456c-a0f4-002960e5f18c" containerID="6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d" exitCode=0 Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.755529 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.761493 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" event={"ID":"c7224368-9599-456c-a0f4-002960e5f18c","Type":"ContainerDied","Data":"6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.761528 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-26g59" event={"ID":"c7224368-9599-456c-a0f4-002960e5f18c","Type":"ContainerDied","Data":"8af10146d6bd88ce7cab95ee3f5d5f3225bb3dee290d9e83abd7002c4bb040d5"} Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.761545 4799 scope.go:117] "RemoveContainer" containerID="6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.761976 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerName="dnsmasq-dns" containerID="cri-o://abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db" gracePeriod=10 Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.777409 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-q7s8k" podStartSLOduration=1.777377534 podStartE2EDuration="1.777377534s" podCreationTimestamp="2026-03-19 20:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:22:24.767718269 +0000 UTC m=+1022.373671361" watchObservedRunningTime="2026-03-19 20:22:24.777377534 +0000 UTC m=+1022.383330606" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.789855 4799 scope.go:117] "RemoveContainer" containerID="d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3" Mar 19 20:22:24 
crc kubenswrapper[4799]: I0319 20:22:24.837922 4799 scope.go:117] "RemoveContainer" containerID="6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.838068 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-26g59"] Mar 19 20:22:24 crc kubenswrapper[4799]: E0319 20:22:24.838705 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d\": container with ID starting with 6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d not found: ID does not exist" containerID="6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.838743 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d"} err="failed to get container status \"6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d\": rpc error: code = NotFound desc = could not find container \"6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d\": container with ID starting with 6e3dabec7baeb89eca4c5f4a4f3ce8644d10aa1c92b252b136bf571d41d0ee7d not found: ID does not exist" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.838770 4799 scope.go:117] "RemoveContainer" containerID="d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3" Mar 19 20:22:24 crc kubenswrapper[4799]: E0319 20:22:24.839115 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3\": container with ID starting with d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3 not found: ID does not exist" 
containerID="d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.839164 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3"} err="failed to get container status \"d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3\": rpc error: code = NotFound desc = could not find container \"d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3\": container with ID starting with d3e9270b7e762e6f9159421f69abc52075013f30784eedf998bc9afc5326b8a3 not found: ID does not exist" Mar 19 20:22:24 crc kubenswrapper[4799]: I0319 20:22:24.843202 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-26g59"] Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.009138 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-lvbzp"] Mar 19 20:22:25 crc kubenswrapper[4799]: W0319 20:22:25.061451 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc766b6d_f977_4050_a719_5327a3a351e0.slice/crio-403af416d84fd1a8d54fa7e5b72c12b61a927d24f057a20248533f88b83f9c23 WatchSource:0}: Error finding container 403af416d84fd1a8d54fa7e5b72c12b61a927d24f057a20248533f88b83f9c23: Status 404 returned error can't find the container with id 403af416d84fd1a8d54fa7e5b72c12b61a927d24f057a20248533f88b83f9c23 Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.126956 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7224368-9599-456c-a0f4-002960e5f18c" path="/var/lib/kubelet/pods/c7224368-9599-456c-a0f4-002960e5f18c/volumes" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.172022 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.266070 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.266717 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7224368-9599-456c-a0f4-002960e5f18c" containerName="dnsmasq-dns" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.266737 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7224368-9599-456c-a0f4-002960e5f18c" containerName="dnsmasq-dns" Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.266753 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7224368-9599-456c-a0f4-002960e5f18c" containerName="init" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.266761 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7224368-9599-456c-a0f4-002960e5f18c" containerName="init" Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.266772 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0054f536-14f7-4796-b4a0-bbf9cb1bebec" containerName="init" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.266780 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0054f536-14f7-4796-b4a0-bbf9cb1bebec" containerName="init" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.267004 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7224368-9599-456c-a0f4-002960e5f18c" containerName="dnsmasq-dns" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.267026 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0054f536-14f7-4796-b4a0-bbf9cb1bebec" containerName="init" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.272778 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.276597 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qrgm8" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.276933 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.277102 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.277277 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.290574 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.326128 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-ovsdbserver-nb\") pod \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.326216 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58xmp\" (UniqueName: \"kubernetes.io/projected/0054f536-14f7-4796-b4a0-bbf9cb1bebec-kube-api-access-58xmp\") pod \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.326234 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-dns-svc\") pod \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.326272 
4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-config\") pod \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\" (UID: \"0054f536-14f7-4796-b4a0-bbf9cb1bebec\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.334006 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0054f536-14f7-4796-b4a0-bbf9cb1bebec-kube-api-access-58xmp" (OuterVolumeSpecName: "kube-api-access-58xmp") pod "0054f536-14f7-4796-b4a0-bbf9cb1bebec" (UID: "0054f536-14f7-4796-b4a0-bbf9cb1bebec"). InnerVolumeSpecName "kube-api-access-58xmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.336471 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.397945 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0054f536-14f7-4796-b4a0-bbf9cb1bebec" (UID: "0054f536-14f7-4796-b4a0-bbf9cb1bebec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.402339 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-config" (OuterVolumeSpecName: "config") pod "0054f536-14f7-4796-b4a0-bbf9cb1bebec" (UID: "0054f536-14f7-4796-b4a0-bbf9cb1bebec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.407985 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0054f536-14f7-4796-b4a0-bbf9cb1bebec" (UID: "0054f536-14f7-4796-b4a0-bbf9cb1bebec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428524 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428577 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-cache\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428601 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428660 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-lock\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc 
kubenswrapper[4799]: I0319 20:22:25.428693 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkk9\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-kube-api-access-fmkk9\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428758 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428824 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428836 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58xmp\" (UniqueName: \"kubernetes.io/projected/0054f536-14f7-4796-b4a0-bbf9cb1bebec-kube-api-access-58xmp\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428845 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.428853 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0054f536-14f7-4796-b4a0-bbf9cb1bebec-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531131 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-728kn\" (UniqueName: 
\"kubernetes.io/projected/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-kube-api-access-728kn\") pod \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531191 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-config\") pod \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531338 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-dns-svc\") pod \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\" (UID: \"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c\") " Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531639 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-cache\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531681 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531732 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-lock\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531767 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkk9\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-kube-api-access-fmkk9\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531835 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.531906 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.532061 4799 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.532078 4799 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.532132 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift podName:fa0c0465-9e46-41e7-88b3-07a6da9cd6c7 nodeName:}" failed. No retries permitted until 2026-03-19 20:22:26.032111114 +0000 UTC m=+1023.638064186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift") pod "swift-storage-0" (UID: "fa0c0465-9e46-41e7-88b3-07a6da9cd6c7") : configmap "swift-ring-files" not found Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.532878 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-lock\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.534407 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-kube-api-access-728kn" (OuterVolumeSpecName: "kube-api-access-728kn") pod "ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" (UID: "ca0f5c48-f13c-4bc5-a79e-137f7a039e8c"). InnerVolumeSpecName "kube-api-access-728kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.534914 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-cache\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.540503 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.541213 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.570578 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.573667 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkk9\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-kube-api-access-fmkk9\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.577243 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-config" (OuterVolumeSpecName: "config") pod "ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" (UID: "ca0f5c48-f13c-4bc5-a79e-137f7a039e8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.582987 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" (UID: "ca0f5c48-f13c-4bc5-a79e-137f7a039e8c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.633746 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.633791 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-728kn\" (UniqueName: \"kubernetes.io/projected/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-kube-api-access-728kn\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.633810 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.775309 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerID="e7797d4a29166b178223d9f998c8d44c91b60734b22bc567ddfaed7013d05cfb" exitCode=0 Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.775435 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" event={"ID":"0d76bbee-3d03-4df7-bdea-8aa71418225f","Type":"ContainerDied","Data":"e7797d4a29166b178223d9f998c8d44c91b60734b22bc567ddfaed7013d05cfb"} Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.777890 4799 generic.go:334] "Generic (PLEG): container finished" podID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerID="abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db" exitCode=0 Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.777938 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.777959 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" event={"ID":"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c","Type":"ContainerDied","Data":"abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db"} Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.777986 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-zpfvp" event={"ID":"ca0f5c48-f13c-4bc5-a79e-137f7a039e8c","Type":"ContainerDied","Data":"569aeb9ee175609092c06c737f20617a5fdb92f90f4b8a3923b80776cf472540"} Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.778035 4799 scope.go:117] "RemoveContainer" containerID="abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.782551 4799 generic.go:334] "Generic (PLEG): container finished" podID="fc766b6d-f977-4050-a719-5327a3a351e0" containerID="7d2a71c97bbe34456f4119637ade92cc9641790c4f95dc448432908cbc0bf4b0" exitCode=0 Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.782743 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" event={"ID":"fc766b6d-f977-4050-a719-5327a3a351e0","Type":"ContainerDied","Data":"7d2a71c97bbe34456f4119637ade92cc9641790c4f95dc448432908cbc0bf4b0"} Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.782776 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" event={"ID":"fc766b6d-f977-4050-a719-5327a3a351e0","Type":"ContainerStarted","Data":"403af416d84fd1a8d54fa7e5b72c12b61a927d24f057a20248533f88b83f9c23"} Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.785826 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.786019 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-nz4f8" event={"ID":"0054f536-14f7-4796-b4a0-bbf9cb1bebec","Type":"ContainerDied","Data":"320fd9a7a340b73cf34e0035e425fce4e2cb3dd1b3a9b2df2fe982de81a37ff4"} Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.798008 4799 scope.go:117] "RemoveContainer" containerID="d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.826159 4799 scope.go:117] "RemoveContainer" containerID="abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db" Mar 19 20:22:25 crc kubenswrapper[4799]: E0319 20:22:25.826688 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db\": container with ID starting with abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db not found: ID does not exist" containerID="abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.826735 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db"} err="failed to get container status \"abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db\": rpc error: code = NotFound desc = could not find container \"abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db\": container with ID starting with abc95e08cee8062bf1454408030a06943332b7f017cffe26130c8ef15eaf12db not found: ID does not exist" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.826788 4799 scope.go:117] "RemoveContainer" containerID="d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee" Mar 19 20:22:25 crc 
kubenswrapper[4799]: E0319 20:22:25.827053 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee\": container with ID starting with d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee not found: ID does not exist" containerID="d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.827101 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee"} err="failed to get container status \"d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee\": rpc error: code = NotFound desc = could not find container \"d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee\": container with ID starting with d75ddcc88198b2a7ac902e941ca98de9dd7fa90461e557bdfdd60017720ccfee not found: ID does not exist" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.827118 4799 scope.go:117] "RemoveContainer" containerID="83821e04a726bc98862e3b0e6e7a3f64f8d4aa2037764a8132ad5eb0f4c7ec80" Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.854421 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-zpfvp"] Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.865993 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-zpfvp"] Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.896966 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-nz4f8"] Mar 19 20:22:25 crc kubenswrapper[4799]: I0319 20:22:25.908776 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-nz4f8"] Mar 19 20:22:26 crc kubenswrapper[4799]: I0319 20:22:26.040327 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:26 crc kubenswrapper[4799]: E0319 20:22:26.040603 4799 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 20:22:26 crc kubenswrapper[4799]: E0319 20:22:26.040621 4799 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 20:22:26 crc kubenswrapper[4799]: E0319 20:22:26.040661 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift podName:fa0c0465-9e46-41e7-88b3-07a6da9cd6c7 nodeName:}" failed. No retries permitted until 2026-03-19 20:22:27.040648808 +0000 UTC m=+1024.646601880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift") pod "swift-storage-0" (UID: "fa0c0465-9e46-41e7-88b3-07a6da9cd6c7") : configmap "swift-ring-files" not found Mar 19 20:22:26 crc kubenswrapper[4799]: E0319 20:22:26.081284 4799 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 20:22:26 crc kubenswrapper[4799]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/0d76bbee-3d03-4df7-bdea-8aa71418225f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 20:22:26 crc kubenswrapper[4799]: > podSandboxID="3afb8abe50174751f83db0fd5f266758c510b692c3e3150f0ee36c42c0c3f39a" Mar 19 20:22:26 crc kubenswrapper[4799]: E0319 20:22:26.081453 4799 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 20:22:26 crc kubenswrapper[4799]: container 
&Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fgxjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f697c8bff-tjh5n_openstack(0d76bbee-3d03-4df7-bdea-8aa71418225f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/0d76bbee-3d03-4df7-bdea-8aa71418225f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 20:22:26 crc kubenswrapper[4799]: > logger="UnhandledError" Mar 19 20:22:26 crc kubenswrapper[4799]: E0319 20:22:26.083311 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/0d76bbee-3d03-4df7-bdea-8aa71418225f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" Mar 19 20:22:27 crc kubenswrapper[4799]: I0319 20:22:27.062601 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:27 crc kubenswrapper[4799]: E0319 20:22:27.063298 4799 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 20:22:27 crc kubenswrapper[4799]: E0319 20:22:27.063332 4799 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 20:22:27 crc kubenswrapper[4799]: E0319 20:22:27.063438 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift podName:fa0c0465-9e46-41e7-88b3-07a6da9cd6c7 nodeName:}" failed. No retries permitted until 2026-03-19 20:22:29.063412132 +0000 UTC m=+1026.669365234 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift") pod "swift-storage-0" (UID: "fa0c0465-9e46-41e7-88b3-07a6da9cd6c7") : configmap "swift-ring-files" not found Mar 19 20:22:27 crc kubenswrapper[4799]: I0319 20:22:27.130585 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0054f536-14f7-4796-b4a0-bbf9cb1bebec" path="/var/lib/kubelet/pods/0054f536-14f7-4796-b4a0-bbf9cb1bebec/volumes" Mar 19 20:22:27 crc kubenswrapper[4799]: I0319 20:22:27.131456 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" path="/var/lib/kubelet/pods/ca0f5c48-f13c-4bc5-a79e-137f7a039e8c/volumes" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.098301 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:29 crc kubenswrapper[4799]: E0319 20:22:29.098592 4799 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 20:22:29 crc kubenswrapper[4799]: E0319 20:22:29.098844 4799 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 20:22:29 crc kubenswrapper[4799]: E0319 20:22:29.098931 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift podName:fa0c0465-9e46-41e7-88b3-07a6da9cd6c7 nodeName:}" failed. No retries permitted until 2026-03-19 20:22:33.098907066 +0000 UTC m=+1030.704860168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift") pod "swift-storage-0" (UID: "fa0c0465-9e46-41e7-88b3-07a6da9cd6c7") : configmap "swift-ring-files" not found Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.171423 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-stws7"] Mar 19 20:22:29 crc kubenswrapper[4799]: E0319 20:22:29.172191 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerName="init" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.172214 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerName="init" Mar 19 20:22:29 crc kubenswrapper[4799]: E0319 20:22:29.172258 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerName="dnsmasq-dns" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.172266 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerName="dnsmasq-dns" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.172658 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0f5c48-f13c-4bc5-a79e-137f7a039e8c" containerName="dnsmasq-dns" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.173677 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.177896 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.177963 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.183096 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.207966 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-stws7"] Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.306529 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-swiftconf\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.306676 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-ring-data-devices\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.306926 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-combined-ca-bundle\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 
20:22:29.307014 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f8bd39c-7709-4713-b7f6-9713873dae5b-etc-swift\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.307101 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-dispersionconf\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.307198 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcs5n\" (UniqueName: \"kubernetes.io/projected/1f8bd39c-7709-4713-b7f6-9713873dae5b-kube-api-access-bcs5n\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.307270 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-scripts\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.407814 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-combined-ca-bundle\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 
20:22:29.408262 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f8bd39c-7709-4713-b7f6-9713873dae5b-etc-swift\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.408294 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-dispersionconf\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.408339 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcs5n\" (UniqueName: \"kubernetes.io/projected/1f8bd39c-7709-4713-b7f6-9713873dae5b-kube-api-access-bcs5n\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.408371 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-scripts\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.408435 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-swiftconf\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.408467 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-ring-data-devices\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.408713 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f8bd39c-7709-4713-b7f6-9713873dae5b-etc-swift\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.409314 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-ring-data-devices\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.409451 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-scripts\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.416620 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-swiftconf\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.417809 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-dispersionconf\") pod 
\"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.441956 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcs5n\" (UniqueName: \"kubernetes.io/projected/1f8bd39c-7709-4713-b7f6-9713873dae5b-kube-api-access-bcs5n\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.461329 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-combined-ca-bundle\") pod \"swift-ring-rebalance-stws7\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.515736 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.826912 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" event={"ID":"fc766b6d-f977-4050-a719-5327a3a351e0","Type":"ContainerStarted","Data":"5676b690020210ecb247cd185ae33755305c7ceb5fc992bfa9b28ea2f8c95f5a"} Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.827217 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.829303 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" event={"ID":"0d76bbee-3d03-4df7-bdea-8aa71418225f","Type":"ContainerStarted","Data":"1537bce83c2c158e61e4acd2a4abaac0bbbb687600a47cb12bce2d0b71303420"} Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.829557 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.852241 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" podStartSLOduration=5.852221286 podStartE2EDuration="5.852221286s" podCreationTimestamp="2026-03-19 20:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:22:29.849489402 +0000 UTC m=+1027.455442484" watchObservedRunningTime="2026-03-19 20:22:29.852221286 +0000 UTC m=+1027.458174368" Mar 19 20:22:29 crc kubenswrapper[4799]: I0319 20:22:29.867378 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" podStartSLOduration=6.8673615120000004 podStartE2EDuration="6.867361512s" podCreationTimestamp="2026-03-19 20:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:22:29.865978394 +0000 UTC m=+1027.471931466" watchObservedRunningTime="2026-03-19 20:22:29.867361512 +0000 UTC m=+1027.473314584" Mar 19 20:22:29 crc kubenswrapper[4799]: W0319 20:22:29.998793 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f8bd39c_7709_4713_b7f6_9713873dae5b.slice/crio-a01e5b95b9d870dee8258025cf732e1c5a3273961e8ebd035f5eee39efc63943 WatchSource:0}: Error finding container a01e5b95b9d870dee8258025cf732e1c5a3273961e8ebd035f5eee39efc63943: Status 404 returned error can't find the container with id a01e5b95b9d870dee8258025cf732e1c5a3273961e8ebd035f5eee39efc63943 Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.000605 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-stws7"] Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.269584 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.269865 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.837506 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-stws7" event={"ID":"1f8bd39c-7709-4713-b7f6-9713873dae5b","Type":"ContainerStarted","Data":"a01e5b95b9d870dee8258025cf732e1c5a3273961e8ebd035f5eee39efc63943"} Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.839337 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"df45d4ef-7350-42d7-a2d7-cede9b13ff55","Type":"ContainerStarted","Data":"065c3b2d20b721e93d06b66dd074a9f89c32c22c744164c8ecb5051e4b4a99c9"} Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.839407 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"df45d4ef-7350-42d7-a2d7-cede9b13ff55","Type":"ContainerStarted","Data":"6d25edad65292c92c9c016404c41df1688cb6bd92aab7094006afe9038ab5846"} Mar 19 20:22:30 crc kubenswrapper[4799]: I0319 20:22:30.876102 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.462157585 podStartE2EDuration="7.876077801s" podCreationTimestamp="2026-03-19 20:22:23 +0000 UTC" firstStartedPulling="2026-03-19 20:22:24.555833145 +0000 UTC m=+1022.161786217" lastFinishedPulling="2026-03-19 20:22:29.969753351 +0000 UTC m=+1027.575706433" observedRunningTime="2026-03-19 20:22:30.863113435 +0000 UTC m=+1028.469066547" watchObservedRunningTime="2026-03-19 20:22:30.876077801 +0000 UTC m=+1028.482030883" Mar 19 20:22:31 crc kubenswrapper[4799]: I0319 20:22:31.714295 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 20:22:31 crc kubenswrapper[4799]: I0319 20:22:31.714449 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 20:22:31 crc kubenswrapper[4799]: I0319 20:22:31.797520 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 20:22:31 crc kubenswrapper[4799]: I0319 20:22:31.845405 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 20:22:31 crc kubenswrapper[4799]: I0319 20:22:31.921134 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 20:22:32 crc kubenswrapper[4799]: I0319 20:22:32.663418 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 20:22:32 crc kubenswrapper[4799]: I0319 20:22:32.767529 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.032670 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a833-account-create-update-ps2nk"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.033920 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.040332 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.048155 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a833-account-create-update-ps2nk"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.157213 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-tm2mp"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.179182 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tm2mp"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.179219 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sqxfw"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.180437 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.180817 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.181729 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc4t9\" (UniqueName: \"kubernetes.io/projected/6023c298-3c8c-4d62-9f55-c55bede668e6-kube-api-access-cc4t9\") pod \"keystone-a833-account-create-update-ps2nk\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.181792 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6023c298-3c8c-4d62-9f55-c55bede668e6-operator-scripts\") pod \"keystone-a833-account-create-update-ps2nk\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.181822 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:33 crc kubenswrapper[4799]: E0319 20:22:33.181990 4799 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 20:22:33 crc kubenswrapper[4799]: E0319 20:22:33.182002 4799 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 20:22:33 crc kubenswrapper[4799]: E0319 20:22:33.182044 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift podName:fa0c0465-9e46-41e7-88b3-07a6da9cd6c7 nodeName:}" failed. 
No retries permitted until 2026-03-19 20:22:41.182030667 +0000 UTC m=+1038.787983739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift") pod "swift-storage-0" (UID: "fa0c0465-9e46-41e7-88b3-07a6da9cd6c7") : configmap "swift-ring-files" not found Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.186952 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqxfw"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.232317 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ebac-account-create-update-f6j6j"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.233453 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.235541 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.262543 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ebac-account-create-update-f6j6j"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.282917 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc4t9\" (UniqueName: \"kubernetes.io/projected/6023c298-3c8c-4d62-9f55-c55bede668e6-kube-api-access-cc4t9\") pod \"keystone-a833-account-create-update-ps2nk\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.282976 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-operator-scripts\") pod 
\"placement-ebac-account-create-update-f6j6j\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283028 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6023c298-3c8c-4d62-9f55-c55bede668e6-operator-scripts\") pod \"keystone-a833-account-create-update-ps2nk\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283254 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-utilities\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283323 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mbg\" (UniqueName: \"kubernetes.io/projected/41613e4d-5e3c-4d83-80be-159885c967bf-kube-api-access-d9mbg\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283445 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-operator-scripts\") pod \"keystone-db-create-tm2mp\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283504 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-catalog-content\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283605 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44brc\" (UniqueName: \"kubernetes.io/projected/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-kube-api-access-44brc\") pod \"placement-ebac-account-create-update-f6j6j\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283643 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k292l\" (UniqueName: \"kubernetes.io/projected/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-kube-api-access-k292l\") pod \"keystone-db-create-tm2mp\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.283740 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6023c298-3c8c-4d62-9f55-c55bede668e6-operator-scripts\") pod \"keystone-a833-account-create-update-ps2nk\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.313427 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc4t9\" (UniqueName: \"kubernetes.io/projected/6023c298-3c8c-4d62-9f55-c55bede668e6-kube-api-access-cc4t9\") pod \"keystone-a833-account-create-update-ps2nk\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.319285 4799 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vk8mz"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.320181 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.337965 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vk8mz"] Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.357322 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385571 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-utilities\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385627 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mbg\" (UniqueName: \"kubernetes.io/projected/41613e4d-5e3c-4d83-80be-159885c967bf-kube-api-access-d9mbg\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385656 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-operator-scripts\") pod \"placement-db-create-vk8mz\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385683 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gnvxn\" (UniqueName: \"kubernetes.io/projected/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-kube-api-access-gnvxn\") pod \"placement-db-create-vk8mz\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385706 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-operator-scripts\") pod \"keystone-db-create-tm2mp\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385729 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-catalog-content\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385764 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44brc\" (UniqueName: \"kubernetes.io/projected/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-kube-api-access-44brc\") pod \"placement-ebac-account-create-update-f6j6j\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385785 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k292l\" (UniqueName: \"kubernetes.io/projected/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-kube-api-access-k292l\") pod \"keystone-db-create-tm2mp\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.385826 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-operator-scripts\") pod \"placement-ebac-account-create-update-f6j6j\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.386142 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-utilities\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.386632 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-operator-scripts\") pod \"placement-ebac-account-create-update-f6j6j\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.386757 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-catalog-content\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.386784 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-operator-scripts\") pod \"keystone-db-create-tm2mp\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.419077 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44brc\" 
(UniqueName: \"kubernetes.io/projected/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-kube-api-access-44brc\") pod \"placement-ebac-account-create-update-f6j6j\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.419227 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mbg\" (UniqueName: \"kubernetes.io/projected/41613e4d-5e3c-4d83-80be-159885c967bf-kube-api-access-d9mbg\") pod \"redhat-marketplace-sqxfw\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.419273 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k292l\" (UniqueName: \"kubernetes.io/projected/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-kube-api-access-k292l\") pod \"keystone-db-create-tm2mp\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.486589 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-operator-scripts\") pod \"placement-db-create-vk8mz\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.486639 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvxn\" (UniqueName: \"kubernetes.io/projected/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-kube-api-access-gnvxn\") pod \"placement-db-create-vk8mz\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.487578 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-operator-scripts\") pod \"placement-db-create-vk8mz\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.503676 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvxn\" (UniqueName: \"kubernetes.io/projected/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-kube-api-access-gnvxn\") pod \"placement-db-create-vk8mz\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.512133 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.521025 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.564722 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:33 crc kubenswrapper[4799]: I0319 20:22:33.659402 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.546622 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.601924 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-tjh5n"] Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.602149 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerName="dnsmasq-dns" containerID="cri-o://1537bce83c2c158e61e4acd2a4abaac0bbbb687600a47cb12bce2d0b71303420" gracePeriod=10 Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.606964 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.639272 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ebac-account-create-update-f6j6j"] Mar 19 20:22:34 crc kubenswrapper[4799]: W0319 20:22:34.692659 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f16902_ec65_40c0_bb69_0c0ee2d8b2e2.slice/crio-21b4f53c06b0f909ecb51ef20b8e831788945a2f7be813a05a9a1a09f779f673 WatchSource:0}: Error finding container 21b4f53c06b0f909ecb51ef20b8e831788945a2f7be813a05a9a1a09f779f673: Status 404 returned error can't find the container with id 21b4f53c06b0f909ecb51ef20b8e831788945a2f7be813a05a9a1a09f779f673 Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.710895 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqxfw"] Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.736161 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-create-tm2mp"] Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.830402 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a833-account-create-update-ps2nk"] Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.836566 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vk8mz"] Mar 19 20:22:34 crc kubenswrapper[4799]: W0319 20:22:34.839777 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6023c298_3c8c_4d62_9f55_c55bede668e6.slice/crio-34ed333eac526bec565c79a69a43f1a51e95835903fc9c5fd649a01b1197c356 WatchSource:0}: Error finding container 34ed333eac526bec565c79a69a43f1a51e95835903fc9c5fd649a01b1197c356: Status 404 returned error can't find the container with id 34ed333eac526bec565c79a69a43f1a51e95835903fc9c5fd649a01b1197c356 Mar 19 20:22:34 crc kubenswrapper[4799]: W0319 20:22:34.849734 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dcf216f_4e59_4164_98c4_b13a5ee6ac18.slice/crio-5f4159a331de9cc5e561b6f75dac7a6db7b24e32572de0b63b0dc89a92d38469 WatchSource:0}: Error finding container 5f4159a331de9cc5e561b6f75dac7a6db7b24e32572de0b63b0dc89a92d38469: Status 404 returned error can't find the container with id 5f4159a331de9cc5e561b6f75dac7a6db7b24e32572de0b63b0dc89a92d38469 Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.954722 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vk8mz" event={"ID":"2dcf216f-4e59-4164-98c4-b13a5ee6ac18","Type":"ContainerStarted","Data":"5f4159a331de9cc5e561b6f75dac7a6db7b24e32572de0b63b0dc89a92d38469"} Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.962245 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ebac-account-create-update-f6j6j" 
event={"ID":"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2","Type":"ContainerStarted","Data":"21b4f53c06b0f909ecb51ef20b8e831788945a2f7be813a05a9a1a09f779f673"} Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.964752 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerID="1537bce83c2c158e61e4acd2a4abaac0bbbb687600a47cb12bce2d0b71303420" exitCode=0 Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.964796 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" event={"ID":"0d76bbee-3d03-4df7-bdea-8aa71418225f","Type":"ContainerDied","Data":"1537bce83c2c158e61e4acd2a4abaac0bbbb687600a47cb12bce2d0b71303420"} Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.965789 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tm2mp" event={"ID":"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf","Type":"ContainerStarted","Data":"ca1188e4d62c80b80e0e2a34d75fd598ad19a023aecd9a56ea921081db135005"} Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.967044 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerStarted","Data":"d1c38738b7f79499c8a23000522f256b707d4c5bd64cffc0354b710dd051c24f"} Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.968600 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-stws7" event={"ID":"1f8bd39c-7709-4713-b7f6-9713873dae5b","Type":"ContainerStarted","Data":"b08bfc62ee5f20e90f8bea6a9541849a2f6cabaff655c6c2b27be7bc22c1c1cd"} Mar 19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.977105 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a833-account-create-update-ps2nk" event={"ID":"6023c298-3c8c-4d62-9f55-c55bede668e6","Type":"ContainerStarted","Data":"34ed333eac526bec565c79a69a43f1a51e95835903fc9c5fd649a01b1197c356"} Mar 
19 20:22:34 crc kubenswrapper[4799]: I0319 20:22:34.997800 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-stws7" podStartSLOduration=1.817276927 podStartE2EDuration="5.99777797s" podCreationTimestamp="2026-03-19 20:22:29 +0000 UTC" firstStartedPulling="2026-03-19 20:22:30.002033967 +0000 UTC m=+1027.607987039" lastFinishedPulling="2026-03-19 20:22:34.18253501 +0000 UTC m=+1031.788488082" observedRunningTime="2026-03-19 20:22:34.992917167 +0000 UTC m=+1032.598870239" watchObservedRunningTime="2026-03-19 20:22:34.99777797 +0000 UTC m=+1032.603731042" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.104932 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.217583 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-dns-svc\") pod \"0d76bbee-3d03-4df7-bdea-8aa71418225f\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.217639 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-sb\") pod \"0d76bbee-3d03-4df7-bdea-8aa71418225f\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.217691 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-config\") pod \"0d76bbee-3d03-4df7-bdea-8aa71418225f\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.217743 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-nb\") pod \"0d76bbee-3d03-4df7-bdea-8aa71418225f\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.217888 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgxjv\" (UniqueName: \"kubernetes.io/projected/0d76bbee-3d03-4df7-bdea-8aa71418225f-kube-api-access-fgxjv\") pod \"0d76bbee-3d03-4df7-bdea-8aa71418225f\" (UID: \"0d76bbee-3d03-4df7-bdea-8aa71418225f\") " Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.223917 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d76bbee-3d03-4df7-bdea-8aa71418225f-kube-api-access-fgxjv" (OuterVolumeSpecName: "kube-api-access-fgxjv") pod "0d76bbee-3d03-4df7-bdea-8aa71418225f" (UID: "0d76bbee-3d03-4df7-bdea-8aa71418225f"). InnerVolumeSpecName "kube-api-access-fgxjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.256210 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d76bbee-3d03-4df7-bdea-8aa71418225f" (UID: "0d76bbee-3d03-4df7-bdea-8aa71418225f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.257067 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-config" (OuterVolumeSpecName: "config") pod "0d76bbee-3d03-4df7-bdea-8aa71418225f" (UID: "0d76bbee-3d03-4df7-bdea-8aa71418225f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.259880 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d76bbee-3d03-4df7-bdea-8aa71418225f" (UID: "0d76bbee-3d03-4df7-bdea-8aa71418225f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.269143 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d76bbee-3d03-4df7-bdea-8aa71418225f" (UID: "0d76bbee-3d03-4df7-bdea-8aa71418225f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.321476 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.321609 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgxjv\" (UniqueName: \"kubernetes.io/projected/0d76bbee-3d03-4df7-bdea-8aa71418225f-kube-api-access-fgxjv\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.321660 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.321699 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 
20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.321719 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d76bbee-3d03-4df7-bdea-8aa71418225f-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.988468 4799 generic.go:334] "Generic (PLEG): container finished" podID="2dcf216f-4e59-4164-98c4-b13a5ee6ac18" containerID="51ddad6e8229f881952601504fb79069a156159dca5061b9e3077fca7e080b46" exitCode=0 Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.988569 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vk8mz" event={"ID":"2dcf216f-4e59-4164-98c4-b13a5ee6ac18","Type":"ContainerDied","Data":"51ddad6e8229f881952601504fb79069a156159dca5061b9e3077fca7e080b46"} Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.990932 4799 generic.go:334] "Generic (PLEG): container finished" podID="43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" containerID="b087ccf55fd7c853a2ceaed6b641d9b27e0dd1af2c6d35efa658aed78f635e7b" exitCode=0 Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.991052 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ebac-account-create-update-f6j6j" event={"ID":"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2","Type":"ContainerDied","Data":"b087ccf55fd7c853a2ceaed6b641d9b27e0dd1af2c6d35efa658aed78f635e7b"} Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.994092 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.994110 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-tjh5n" event={"ID":"0d76bbee-3d03-4df7-bdea-8aa71418225f","Type":"ContainerDied","Data":"3afb8abe50174751f83db0fd5f266758c510b692c3e3150f0ee36c42c0c3f39a"} Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.994189 4799 scope.go:117] "RemoveContainer" containerID="1537bce83c2c158e61e4acd2a4abaac0bbbb687600a47cb12bce2d0b71303420" Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.996588 4799 generic.go:334] "Generic (PLEG): container finished" podID="8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" containerID="16013e65d880800730d4b216afa69c2977af4778be676c2e7a13f0f45c121de2" exitCode=0 Mar 19 20:22:35 crc kubenswrapper[4799]: I0319 20:22:35.996666 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tm2mp" event={"ID":"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf","Type":"ContainerDied","Data":"16013e65d880800730d4b216afa69c2977af4778be676c2e7a13f0f45c121de2"} Mar 19 20:22:36 crc kubenswrapper[4799]: I0319 20:22:36.007080 4799 generic.go:334] "Generic (PLEG): container finished" podID="41613e4d-5e3c-4d83-80be-159885c967bf" containerID="57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857" exitCode=0 Mar 19 20:22:36 crc kubenswrapper[4799]: I0319 20:22:36.007196 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerDied","Data":"57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857"} Mar 19 20:22:36 crc kubenswrapper[4799]: I0319 20:22:36.014594 4799 generic.go:334] "Generic (PLEG): container finished" podID="6023c298-3c8c-4d62-9f55-c55bede668e6" containerID="bdf13e8c567059e2e64dfd82bb6b750237a8bea5962dab58029151e3b6148bc1" exitCode=0 Mar 19 20:22:36 crc kubenswrapper[4799]: 
I0319 20:22:36.016782 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a833-account-create-update-ps2nk" event={"ID":"6023c298-3c8c-4d62-9f55-c55bede668e6","Type":"ContainerDied","Data":"bdf13e8c567059e2e64dfd82bb6b750237a8bea5962dab58029151e3b6148bc1"} Mar 19 20:22:36 crc kubenswrapper[4799]: I0319 20:22:36.031605 4799 scope.go:117] "RemoveContainer" containerID="e7797d4a29166b178223d9f998c8d44c91b60734b22bc567ddfaed7013d05cfb" Mar 19 20:22:36 crc kubenswrapper[4799]: I0319 20:22:36.160979 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-tjh5n"] Mar 19 20:22:36 crc kubenswrapper[4799]: I0319 20:22:36.169345 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-tjh5n"] Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.032099 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerStarted","Data":"5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece"} Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.136451 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" path="/var/lib/kubelet/pods/0d76bbee-3d03-4df7-bdea-8aa71418225f/volumes" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.497590 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.570452 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6023c298-3c8c-4d62-9f55-c55bede668e6-operator-scripts\") pod \"6023c298-3c8c-4d62-9f55-c55bede668e6\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.570939 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc4t9\" (UniqueName: \"kubernetes.io/projected/6023c298-3c8c-4d62-9f55-c55bede668e6-kube-api-access-cc4t9\") pod \"6023c298-3c8c-4d62-9f55-c55bede668e6\" (UID: \"6023c298-3c8c-4d62-9f55-c55bede668e6\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.572170 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6023c298-3c8c-4d62-9f55-c55bede668e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6023c298-3c8c-4d62-9f55-c55bede668e6" (UID: "6023c298-3c8c-4d62-9f55-c55bede668e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.577746 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6023c298-3c8c-4d62-9f55-c55bede668e6-kube-api-access-cc4t9" (OuterVolumeSpecName: "kube-api-access-cc4t9") pod "6023c298-3c8c-4d62-9f55-c55bede668e6" (UID: "6023c298-3c8c-4d62-9f55-c55bede668e6"). InnerVolumeSpecName "kube-api-access-cc4t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.592104 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.601242 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.610948 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672367 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k292l\" (UniqueName: \"kubernetes.io/projected/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-kube-api-access-k292l\") pod \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672464 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-operator-scripts\") pod \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672534 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnvxn\" (UniqueName: \"kubernetes.io/projected/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-kube-api-access-gnvxn\") pod \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672623 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-operator-scripts\") pod \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\" (UID: \"2dcf216f-4e59-4164-98c4-b13a5ee6ac18\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672641 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-operator-scripts\") pod \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\" (UID: \"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672692 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44brc\" (UniqueName: \"kubernetes.io/projected/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-kube-api-access-44brc\") pod \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\" (UID: \"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2\") " Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.672984 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" (UID: "43f16902-ec65-40c0-bb69-0c0ee2d8b2e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.673060 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6023c298-3c8c-4d62-9f55-c55bede668e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.673059 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dcf216f-4e59-4164-98c4-b13a5ee6ac18" (UID: "2dcf216f-4e59-4164-98c4-b13a5ee6ac18"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.673077 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc4t9\" (UniqueName: \"kubernetes.io/projected/6023c298-3c8c-4d62-9f55-c55bede668e6-kube-api-access-cc4t9\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.673162 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" (UID: "8a73293d-f9d4-42f6-b03d-21a7ebb99fbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.676508 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-kube-api-access-k292l" (OuterVolumeSpecName: "kube-api-access-k292l") pod "8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" (UID: "8a73293d-f9d4-42f6-b03d-21a7ebb99fbf"). InnerVolumeSpecName "kube-api-access-k292l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.676725 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-kube-api-access-gnvxn" (OuterVolumeSpecName: "kube-api-access-gnvxn") pod "2dcf216f-4e59-4164-98c4-b13a5ee6ac18" (UID: "2dcf216f-4e59-4164-98c4-b13a5ee6ac18"). InnerVolumeSpecName "kube-api-access-gnvxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.677231 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-kube-api-access-44brc" (OuterVolumeSpecName: "kube-api-access-44brc") pod "43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" (UID: "43f16902-ec65-40c0-bb69-0c0ee2d8b2e2"). InnerVolumeSpecName "kube-api-access-44brc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.774220 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k292l\" (UniqueName: \"kubernetes.io/projected/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-kube-api-access-k292l\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.774454 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.774566 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnvxn\" (UniqueName: \"kubernetes.io/projected/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-kube-api-access-gnvxn\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.774639 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcf216f-4e59-4164-98c4-b13a5ee6ac18-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.774719 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:37 crc kubenswrapper[4799]: I0319 20:22:37.774795 4799 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-44brc\" (UniqueName: \"kubernetes.io/projected/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2-kube-api-access-44brc\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.069257 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ebac-account-create-update-f6j6j" event={"ID":"43f16902-ec65-40c0-bb69-0c0ee2d8b2e2","Type":"ContainerDied","Data":"21b4f53c06b0f909ecb51ef20b8e831788945a2f7be813a05a9a1a09f779f673"} Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.071044 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21b4f53c06b0f909ecb51ef20b8e831788945a2f7be813a05a9a1a09f779f673" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.069484 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ebac-account-create-update-f6j6j" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.074457 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tm2mp" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.074466 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tm2mp" event={"ID":"8a73293d-f9d4-42f6-b03d-21a7ebb99fbf","Type":"ContainerDied","Data":"ca1188e4d62c80b80e0e2a34d75fd598ad19a023aecd9a56ea921081db135005"} Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.074551 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca1188e4d62c80b80e0e2a34d75fd598ad19a023aecd9a56ea921081db135005" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.078827 4799 generic.go:334] "Generic (PLEG): container finished" podID="41613e4d-5e3c-4d83-80be-159885c967bf" containerID="5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece" exitCode=0 Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.078926 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerDied","Data":"5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece"} Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.086753 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a833-account-create-update-ps2nk" event={"ID":"6023c298-3c8c-4d62-9f55-c55bede668e6","Type":"ContainerDied","Data":"34ed333eac526bec565c79a69a43f1a51e95835903fc9c5fd649a01b1197c356"} Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.086795 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ed333eac526bec565c79a69a43f1a51e95835903fc9c5fd649a01b1197c356" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.086804 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a833-account-create-update-ps2nk" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.089965 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vk8mz" event={"ID":"2dcf216f-4e59-4164-98c4-b13a5ee6ac18","Type":"ContainerDied","Data":"5f4159a331de9cc5e561b6f75dac7a6db7b24e32572de0b63b0dc89a92d38469"} Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.090028 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vk8mz" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.090030 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f4159a331de9cc5e561b6f75dac7a6db7b24e32572de0b63b0dc89a92d38469" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.181696 4799 scope.go:117] "RemoveContainer" containerID="996eb7018a44b89623da1ef36c83afec8f759bfe5dc6310c886ef2b0962ebe46" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.942772 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xqbjx"] Mar 19 20:22:38 crc kubenswrapper[4799]: E0319 20:22:38.943791 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerName="init" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.943820 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerName="init" Mar 19 20:22:38 crc kubenswrapper[4799]: E0319 20:22:38.943869 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dcf216f-4e59-4164-98c4-b13a5ee6ac18" containerName="mariadb-database-create" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.943882 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcf216f-4e59-4164-98c4-b13a5ee6ac18" containerName="mariadb-database-create" Mar 19 20:22:38 crc kubenswrapper[4799]: E0319 20:22:38.943900 4799 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerName="dnsmasq-dns" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.943911 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerName="dnsmasq-dns" Mar 19 20:22:38 crc kubenswrapper[4799]: E0319 20:22:38.943933 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" containerName="mariadb-account-create-update" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.943944 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" containerName="mariadb-account-create-update" Mar 19 20:22:38 crc kubenswrapper[4799]: E0319 20:22:38.943960 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6023c298-3c8c-4d62-9f55-c55bede668e6" containerName="mariadb-account-create-update" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.943973 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6023c298-3c8c-4d62-9f55-c55bede668e6" containerName="mariadb-account-create-update" Mar 19 20:22:38 crc kubenswrapper[4799]: E0319 20:22:38.944001 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" containerName="mariadb-database-create" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.944012 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" containerName="mariadb-database-create" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.944254 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dcf216f-4e59-4164-98c4-b13a5ee6ac18" containerName="mariadb-database-create" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.944274 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" 
containerName="mariadb-account-create-update" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.944300 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6023c298-3c8c-4d62-9f55-c55bede668e6" containerName="mariadb-account-create-update" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.944318 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" containerName="mariadb-database-create" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.944335 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d76bbee-3d03-4df7-bdea-8aa71418225f" containerName="dnsmasq-dns" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.945170 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.950864 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.959757 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xqbjx"] Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.999367 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhlt\" (UniqueName: \"kubernetes.io/projected/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-kube-api-access-xkhlt\") pod \"root-account-create-update-xqbjx\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:38 crc kubenswrapper[4799]: I0319 20:22:38.999499 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-operator-scripts\") pod \"root-account-create-update-xqbjx\" (UID: 
\"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.101026 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhlt\" (UniqueName: \"kubernetes.io/projected/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-kube-api-access-xkhlt\") pod \"root-account-create-update-xqbjx\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.101078 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-operator-scripts\") pod \"root-account-create-update-xqbjx\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.101350 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerStarted","Data":"3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9"} Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.101973 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-operator-scripts\") pod \"root-account-create-update-xqbjx\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.132732 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhlt\" (UniqueName: \"kubernetes.io/projected/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-kube-api-access-xkhlt\") pod \"root-account-create-update-xqbjx\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " 
pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.135369 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sqxfw" podStartSLOduration=3.66462743 podStartE2EDuration="6.135348705s" podCreationTimestamp="2026-03-19 20:22:33 +0000 UTC" firstStartedPulling="2026-03-19 20:22:36.010546401 +0000 UTC m=+1033.616499473" lastFinishedPulling="2026-03-19 20:22:38.481267636 +0000 UTC m=+1036.087220748" observedRunningTime="2026-03-19 20:22:39.127697514 +0000 UTC m=+1036.733650596" watchObservedRunningTime="2026-03-19 20:22:39.135348705 +0000 UTC m=+1036.741301787" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.263622 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:39 crc kubenswrapper[4799]: I0319 20:22:39.786478 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xqbjx"] Mar 19 20:22:39 crc kubenswrapper[4799]: W0319 20:22:39.793518 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d28cdd7_eb49_4be0_b47e_bc61f80a2349.slice/crio-d4f55798d858aaf3637e18d8956757e39ab7e9ea34d77130d31a0c8ec86c89d9 WatchSource:0}: Error finding container d4f55798d858aaf3637e18d8956757e39ab7e9ea34d77130d31a0c8ec86c89d9: Status 404 returned error can't find the container with id d4f55798d858aaf3637e18d8956757e39ab7e9ea34d77130d31a0c8ec86c89d9 Mar 19 20:22:40 crc kubenswrapper[4799]: I0319 20:22:40.113294 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xqbjx" event={"ID":"5d28cdd7-eb49-4be0-b47e-bc61f80a2349","Type":"ContainerStarted","Data":"9578059facd4d2e677a536d42c8c66afd21b7edf18ff670903facc65d9a754a8"} Mar 19 20:22:40 crc kubenswrapper[4799]: I0319 20:22:40.113823 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-xqbjx" event={"ID":"5d28cdd7-eb49-4be0-b47e-bc61f80a2349","Type":"ContainerStarted","Data":"d4f55798d858aaf3637e18d8956757e39ab7e9ea34d77130d31a0c8ec86c89d9"} Mar 19 20:22:40 crc kubenswrapper[4799]: I0319 20:22:40.133967 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xqbjx" podStartSLOduration=2.133935596 podStartE2EDuration="2.133935596s" podCreationTimestamp="2026-03-19 20:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:22:40.130345037 +0000 UTC m=+1037.736298109" watchObservedRunningTime="2026-03-19 20:22:40.133935596 +0000 UTC m=+1037.739888678" Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.129872 4799 generic.go:334] "Generic (PLEG): container finished" podID="5d28cdd7-eb49-4be0-b47e-bc61f80a2349" containerID="9578059facd4d2e677a536d42c8c66afd21b7edf18ff670903facc65d9a754a8" exitCode=0 Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.129964 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xqbjx" event={"ID":"5d28cdd7-eb49-4be0-b47e-bc61f80a2349","Type":"ContainerDied","Data":"9578059facd4d2e677a536d42c8c66afd21b7edf18ff670903facc65d9a754a8"} Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.131764 4799 generic.go:334] "Generic (PLEG): container finished" podID="1f8bd39c-7709-4713-b7f6-9713873dae5b" containerID="b08bfc62ee5f20e90f8bea6a9541849a2f6cabaff655c6c2b27be7bc22c1c1cd" exitCode=0 Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.131818 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-stws7" event={"ID":"1f8bd39c-7709-4713-b7f6-9713873dae5b","Type":"ContainerDied","Data":"b08bfc62ee5f20e90f8bea6a9541849a2f6cabaff655c6c2b27be7bc22c1c1cd"} Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.239580 
4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.247284 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa0c0465-9e46-41e7-88b3-07a6da9cd6c7-etc-swift\") pod \"swift-storage-0\" (UID: \"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7\") " pod="openstack/swift-storage-0" Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.254504 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 19 20:22:41 crc kubenswrapper[4799]: I0319 20:22:41.868800 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 20:22:41 crc kubenswrapper[4799]: W0319 20:22:41.877872 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0c0465_9e46_41e7_88b3_07a6da9cd6c7.slice/crio-c66bc1d915d3548b53ad49de4a6104ed319eb9b287478fddf68eaa79d6ad88df WatchSource:0}: Error finding container c66bc1d915d3548b53ad49de4a6104ed319eb9b287478fddf68eaa79d6ad88df: Status 404 returned error can't find the container with id c66bc1d915d3548b53ad49de4a6104ed319eb9b287478fddf68eaa79d6ad88df Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.152254 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"c66bc1d915d3548b53ad49de4a6104ed319eb9b287478fddf68eaa79d6ad88df"} Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.247852 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4z9kc"] Mar 19 20:22:42 crc kubenswrapper[4799]: 
I0319 20:22:42.249568 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.260301 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4z9kc"] Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.337787 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-60fe-account-create-update-x4bxp"] Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.338722 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.340878 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.354226 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-60fe-account-create-update-x4bxp"] Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.362479 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872fa1da-9d43-4344-aee7-383a1f418430-operator-scripts\") pod \"glance-db-create-4z9kc\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.362514 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flfdn\" (UniqueName: \"kubernetes.io/projected/872fa1da-9d43-4344-aee7-383a1f418430-kube-api-access-flfdn\") pod \"glance-db-create-4z9kc\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.362604 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a3c84329-8a13-4319-95a0-313681e0c23d-operator-scripts\") pod \"glance-60fe-account-create-update-x4bxp\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.362629 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75gm\" (UniqueName: \"kubernetes.io/projected/a3c84329-8a13-4319-95a0-313681e0c23d-kube-api-access-j75gm\") pod \"glance-60fe-account-create-update-x4bxp\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.471732 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c84329-8a13-4319-95a0-313681e0c23d-operator-scripts\") pod \"glance-60fe-account-create-update-x4bxp\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.472290 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75gm\" (UniqueName: \"kubernetes.io/projected/a3c84329-8a13-4319-95a0-313681e0c23d-kube-api-access-j75gm\") pod \"glance-60fe-account-create-update-x4bxp\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.472430 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872fa1da-9d43-4344-aee7-383a1f418430-operator-scripts\") pod \"glance-db-create-4z9kc\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.472450 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flfdn\" (UniqueName: \"kubernetes.io/projected/872fa1da-9d43-4344-aee7-383a1f418430-kube-api-access-flfdn\") pod \"glance-db-create-4z9kc\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.472599 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c84329-8a13-4319-95a0-313681e0c23d-operator-scripts\") pod \"glance-60fe-account-create-update-x4bxp\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.473058 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872fa1da-9d43-4344-aee7-383a1f418430-operator-scripts\") pod \"glance-db-create-4z9kc\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.491753 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.495179 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flfdn\" (UniqueName: \"kubernetes.io/projected/872fa1da-9d43-4344-aee7-383a1f418430-kube-api-access-flfdn\") pod \"glance-db-create-4z9kc\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.495345 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75gm\" (UniqueName: \"kubernetes.io/projected/a3c84329-8a13-4319-95a0-313681e0c23d-kube-api-access-j75gm\") pod \"glance-60fe-account-create-update-x4bxp\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.573923 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-operator-scripts\") pod \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.574024 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhlt\" (UniqueName: \"kubernetes.io/projected/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-kube-api-access-xkhlt\") pod \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\" (UID: \"5d28cdd7-eb49-4be0-b47e-bc61f80a2349\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.575037 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.575054 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d28cdd7-eb49-4be0-b47e-bc61f80a2349" (UID: "5d28cdd7-eb49-4be0-b47e-bc61f80a2349"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.578298 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-kube-api-access-xkhlt" (OuterVolumeSpecName: "kube-api-access-xkhlt") pod "5d28cdd7-eb49-4be0-b47e-bc61f80a2349" (UID: "5d28cdd7-eb49-4be0-b47e-bc61f80a2349"). InnerVolumeSpecName "kube-api-access-xkhlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.616463 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.671339 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.675769 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f8bd39c-7709-4713-b7f6-9713873dae5b-etc-swift\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.675862 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-scripts\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.675906 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-dispersionconf\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.675957 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-ring-data-devices\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.676142 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcs5n\" (UniqueName: \"kubernetes.io/projected/1f8bd39c-7709-4713-b7f6-9713873dae5b-kube-api-access-bcs5n\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.676180 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-combined-ca-bundle\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.676224 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-swiftconf\") pod \"1f8bd39c-7709-4713-b7f6-9713873dae5b\" (UID: \"1f8bd39c-7709-4713-b7f6-9713873dae5b\") " Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.676724 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhlt\" (UniqueName: \"kubernetes.io/projected/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-kube-api-access-xkhlt\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.676752 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d28cdd7-eb49-4be0-b47e-bc61f80a2349-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.676748 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.677304 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8bd39c-7709-4713-b7f6-9713873dae5b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.683219 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8bd39c-7709-4713-b7f6-9713873dae5b-kube-api-access-bcs5n" (OuterVolumeSpecName: "kube-api-access-bcs5n") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "kube-api-access-bcs5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.684523 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.704732 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.705244 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.706092 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-scripts" (OuterVolumeSpecName: "scripts") pod "1f8bd39c-7709-4713-b7f6-9713873dae5b" (UID: "1f8bd39c-7709-4713-b7f6-9713873dae5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778603 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcs5n\" (UniqueName: \"kubernetes.io/projected/1f8bd39c-7709-4713-b7f6-9713873dae5b-kube-api-access-bcs5n\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778659 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778677 4799 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778695 4799 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f8bd39c-7709-4713-b7f6-9713873dae5b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778714 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778731 4799 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/1f8bd39c-7709-4713-b7f6-9713873dae5b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:42 crc kubenswrapper[4799]: I0319 20:22:42.778747 4799 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f8bd39c-7709-4713-b7f6-9713873dae5b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:43 crc kubenswrapper[4799]: W0319 20:22:43.064314 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod872fa1da_9d43_4344_aee7_383a1f418430.slice/crio-43177806f96d8fcb593d404f929e9d5e3f2b25197456f0019a9200d5b347fce9 WatchSource:0}: Error finding container 43177806f96d8fcb593d404f929e9d5e3f2b25197456f0019a9200d5b347fce9: Status 404 returned error can't find the container with id 43177806f96d8fcb593d404f929e9d5e3f2b25197456f0019a9200d5b347fce9 Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.075830 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4z9kc"] Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.164242 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-60fe-account-create-update-x4bxp"] Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.177226 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-stws7" event={"ID":"1f8bd39c-7709-4713-b7f6-9713873dae5b","Type":"ContainerDied","Data":"a01e5b95b9d870dee8258025cf732e1c5a3273961e8ebd035f5eee39efc63943"} Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.177426 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a01e5b95b9d870dee8258025cf732e1c5a3273961e8ebd035f5eee39efc63943" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.177547 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-stws7" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.180124 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xqbjx" event={"ID":"5d28cdd7-eb49-4be0-b47e-bc61f80a2349","Type":"ContainerDied","Data":"d4f55798d858aaf3637e18d8956757e39ab7e9ea34d77130d31a0c8ec86c89d9"} Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.180180 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f55798d858aaf3637e18d8956757e39ab7e9ea34d77130d31a0c8ec86c89d9" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.180257 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xqbjx" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.189216 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4z9kc" event={"ID":"872fa1da-9d43-4344-aee7-383a1f418430","Type":"ContainerStarted","Data":"43177806f96d8fcb593d404f929e9d5e3f2b25197456f0019a9200d5b347fce9"} Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.513680 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.514298 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.576426 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:43 crc kubenswrapper[4799]: I0319 20:22:43.741986 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.216205 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="a3c84329-8a13-4319-95a0-313681e0c23d" containerID="34b4912ed5f0443d0eb92fceb8283110c88557fba92e208e16f44c826abf69c3" exitCode=0 Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.216350 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-60fe-account-create-update-x4bxp" event={"ID":"a3c84329-8a13-4319-95a0-313681e0c23d","Type":"ContainerDied","Data":"34b4912ed5f0443d0eb92fceb8283110c88557fba92e208e16f44c826abf69c3"} Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.216907 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-60fe-account-create-update-x4bxp" event={"ID":"a3c84329-8a13-4319-95a0-313681e0c23d","Type":"ContainerStarted","Data":"d68d9ac4d080e41b5666605f97e75beb7c2b0ea29525d5b0c673f51583e716b1"} Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.222252 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"a29bd75000a4e4adf783bd1fa16b0a59cd644c000cf0e9726e5e49d6fa1b3558"} Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.222310 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"f66ff43537c2913e59ecf97240013a184edb83174cb4f1a86465391ba8b1c0b2"} Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.222331 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"bb91361bd0afc8a6eeb0562e1ca804154656bfaabac738d18a3f4537a7a19e7e"} Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.224835 4799 generic.go:334] "Generic (PLEG): container finished" podID="872fa1da-9d43-4344-aee7-383a1f418430" containerID="0b8a51b2d969bdc26cb9c7ea02ba8c4b56045e9a8043090cad3c15ff6eb87dc0" exitCode=0 Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 
20:22:44.226062 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4z9kc" event={"ID":"872fa1da-9d43-4344-aee7-383a1f418430","Type":"ContainerDied","Data":"0b8a51b2d969bdc26cb9c7ea02ba8c4b56045e9a8043090cad3c15ff6eb87dc0"} Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.310291 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:44 crc kubenswrapper[4799]: I0319 20:22:44.364203 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqxfw"] Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.239024 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"ebc990cc44468bd482467f21ef117f76472aa8c71f14d8777bc7ba295cee4992"} Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.359502 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xqbjx"] Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.365446 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xqbjx"] Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.605879 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.697035 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.737958 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j75gm\" (UniqueName: \"kubernetes.io/projected/a3c84329-8a13-4319-95a0-313681e0c23d-kube-api-access-j75gm\") pod \"a3c84329-8a13-4319-95a0-313681e0c23d\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.738100 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flfdn\" (UniqueName: \"kubernetes.io/projected/872fa1da-9d43-4344-aee7-383a1f418430-kube-api-access-flfdn\") pod \"872fa1da-9d43-4344-aee7-383a1f418430\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.738168 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872fa1da-9d43-4344-aee7-383a1f418430-operator-scripts\") pod \"872fa1da-9d43-4344-aee7-383a1f418430\" (UID: \"872fa1da-9d43-4344-aee7-383a1f418430\") " Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.738194 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c84329-8a13-4319-95a0-313681e0c23d-operator-scripts\") pod \"a3c84329-8a13-4319-95a0-313681e0c23d\" (UID: \"a3c84329-8a13-4319-95a0-313681e0c23d\") " Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.739110 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/872fa1da-9d43-4344-aee7-383a1f418430-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "872fa1da-9d43-4344-aee7-383a1f418430" (UID: "872fa1da-9d43-4344-aee7-383a1f418430"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.739219 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c84329-8a13-4319-95a0-313681e0c23d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3c84329-8a13-4319-95a0-313681e0c23d" (UID: "a3c84329-8a13-4319-95a0-313681e0c23d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.743948 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c84329-8a13-4319-95a0-313681e0c23d-kube-api-access-j75gm" (OuterVolumeSpecName: "kube-api-access-j75gm") pod "a3c84329-8a13-4319-95a0-313681e0c23d" (UID: "a3c84329-8a13-4319-95a0-313681e0c23d"). InnerVolumeSpecName "kube-api-access-j75gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.744649 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872fa1da-9d43-4344-aee7-383a1f418430-kube-api-access-flfdn" (OuterVolumeSpecName: "kube-api-access-flfdn") pod "872fa1da-9d43-4344-aee7-383a1f418430" (UID: "872fa1da-9d43-4344-aee7-383a1f418430"). InnerVolumeSpecName "kube-api-access-flfdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.839688 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j75gm\" (UniqueName: \"kubernetes.io/projected/a3c84329-8a13-4319-95a0-313681e0c23d-kube-api-access-j75gm\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.839738 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flfdn\" (UniqueName: \"kubernetes.io/projected/872fa1da-9d43-4344-aee7-383a1f418430-kube-api-access-flfdn\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.839752 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/872fa1da-9d43-4344-aee7-383a1f418430-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:45 crc kubenswrapper[4799]: I0319 20:22:45.839761 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c84329-8a13-4319-95a0-313681e0c23d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.254222 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"338287579db43eb84b23aa7514704871a318e01d14dbbad5921bde4e32b24c3b"} Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.256775 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4z9kc" event={"ID":"872fa1da-9d43-4344-aee7-383a1f418430","Type":"ContainerDied","Data":"43177806f96d8fcb593d404f929e9d5e3f2b25197456f0019a9200d5b347fce9"} Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.256812 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43177806f96d8fcb593d404f929e9d5e3f2b25197456f0019a9200d5b347fce9" Mar 19 
20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.256877 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4z9kc" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.273649 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-60fe-account-create-update-x4bxp" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.273723 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-60fe-account-create-update-x4bxp" event={"ID":"a3c84329-8a13-4319-95a0-313681e0c23d","Type":"ContainerDied","Data":"d68d9ac4d080e41b5666605f97e75beb7c2b0ea29525d5b0c673f51583e716b1"} Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.273833 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sqxfw" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="registry-server" containerID="cri-o://3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9" gracePeriod=2 Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.274502 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d68d9ac4d080e41b5666605f97e75beb7c2b0ea29525d5b0c673f51583e716b1" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.748615 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.857021 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-utilities\") pod \"41613e4d-5e3c-4d83-80be-159885c967bf\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.857120 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9mbg\" (UniqueName: \"kubernetes.io/projected/41613e4d-5e3c-4d83-80be-159885c967bf-kube-api-access-d9mbg\") pod \"41613e4d-5e3c-4d83-80be-159885c967bf\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.857246 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-catalog-content\") pod \"41613e4d-5e3c-4d83-80be-159885c967bf\" (UID: \"41613e4d-5e3c-4d83-80be-159885c967bf\") " Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.858114 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-utilities" (OuterVolumeSpecName: "utilities") pod "41613e4d-5e3c-4d83-80be-159885c967bf" (UID: "41613e4d-5e3c-4d83-80be-159885c967bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.862267 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41613e4d-5e3c-4d83-80be-159885c967bf-kube-api-access-d9mbg" (OuterVolumeSpecName: "kube-api-access-d9mbg") pod "41613e4d-5e3c-4d83-80be-159885c967bf" (UID: "41613e4d-5e3c-4d83-80be-159885c967bf"). InnerVolumeSpecName "kube-api-access-d9mbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.896137 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41613e4d-5e3c-4d83-80be-159885c967bf" (UID: "41613e4d-5e3c-4d83-80be-159885c967bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.960480 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.960562 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9mbg\" (UniqueName: \"kubernetes.io/projected/41613e4d-5e3c-4d83-80be-159885c967bf-kube-api-access-d9mbg\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:46 crc kubenswrapper[4799]: I0319 20:22:46.960581 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41613e4d-5e3c-4d83-80be-159885c967bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.133906 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d28cdd7-eb49-4be0-b47e-bc61f80a2349" path="/var/lib/kubelet/pods/5d28cdd7-eb49-4be0-b47e-bc61f80a2349/volumes" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.289768 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"34959dd823d477065123ad904855d7534b76d01acbb9acb5c5fbf3c1382e34ca"} Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.289812 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"e279bc9034bae4998973170225f1d7098b2e45f8302b0b6a155ea330bdb3b38a"} Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.289827 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"02cf180fcfa72147a02489f9b37d850970798c4e8a1a59c83ef88bf37d1b6881"} Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.292654 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqxfw" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.293454 4799 generic.go:334] "Generic (PLEG): container finished" podID="41613e4d-5e3c-4d83-80be-159885c967bf" containerID="3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9" exitCode=0 Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.292650 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerDied","Data":"3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9"} Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.293571 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqxfw" event={"ID":"41613e4d-5e3c-4d83-80be-159885c967bf","Type":"ContainerDied","Data":"d1c38738b7f79499c8a23000522f256b707d4c5bd64cffc0354b710dd051c24f"} Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.293613 4799 scope.go:117] "RemoveContainer" containerID="3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.321917 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqxfw"] Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.326145 4799 
scope.go:117] "RemoveContainer" containerID="5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.328416 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqxfw"] Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.353508 4799 scope.go:117] "RemoveContainer" containerID="57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.379191 4799 scope.go:117] "RemoveContainer" containerID="3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.379674 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9\": container with ID starting with 3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9 not found: ID does not exist" containerID="3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.379707 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9"} err="failed to get container status \"3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9\": rpc error: code = NotFound desc = could not find container \"3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9\": container with ID starting with 3d6d31832505a63133d59e4135e07c828805cc10d43a28470c2b5940964602b9 not found: ID does not exist" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.379726 4799 scope.go:117] "RemoveContainer" containerID="5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.380142 4799 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece\": container with ID starting with 5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece not found: ID does not exist" containerID="5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.380197 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece"} err="failed to get container status \"5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece\": rpc error: code = NotFound desc = could not find container \"5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece\": container with ID starting with 5dec4aec85dde18e901ef087c69cdfdb38448054f4c8c354ba4885e0f5602ece not found: ID does not exist" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.380235 4799 scope.go:117] "RemoveContainer" containerID="57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.380755 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857\": container with ID starting with 57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857 not found: ID does not exist" containerID="57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.380827 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857"} err="failed to get container status \"57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857\": rpc error: code = NotFound desc = could not find container 
\"57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857\": container with ID starting with 57f9528e305d57ae289a3a2770a15734344d9b0410b33d178db2c8c778205857 not found: ID does not exist" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.598877 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ndlfj"] Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599229 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c84329-8a13-4319-95a0-313681e0c23d" containerName="mariadb-account-create-update" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599245 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c84329-8a13-4319-95a0-313681e0c23d" containerName="mariadb-account-create-update" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599262 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d28cdd7-eb49-4be0-b47e-bc61f80a2349" containerName="mariadb-account-create-update" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599270 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d28cdd7-eb49-4be0-b47e-bc61f80a2349" containerName="mariadb-account-create-update" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599292 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="extract-content" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599300 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="extract-content" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599315 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872fa1da-9d43-4344-aee7-383a1f418430" containerName="mariadb-database-create" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599322 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="872fa1da-9d43-4344-aee7-383a1f418430" 
containerName="mariadb-database-create" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599339 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="extract-utilities" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599348 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="extract-utilities" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599360 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8bd39c-7709-4713-b7f6-9713873dae5b" containerName="swift-ring-rebalance" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599367 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8bd39c-7709-4713-b7f6-9713873dae5b" containerName="swift-ring-rebalance" Mar 19 20:22:47 crc kubenswrapper[4799]: E0319 20:22:47.599402 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="registry-server" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599410 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="registry-server" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599586 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c84329-8a13-4319-95a0-313681e0c23d" containerName="mariadb-account-create-update" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599599 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d28cdd7-eb49-4be0-b47e-bc61f80a2349" containerName="mariadb-account-create-update" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599612 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" containerName="registry-server" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599625 4799 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1f8bd39c-7709-4713-b7f6-9713873dae5b" containerName="swift-ring-rebalance" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.599640 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="872fa1da-9d43-4344-aee7-383a1f418430" containerName="mariadb-database-create" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.600245 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.605915 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.608568 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5fzj4" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.624029 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ndlfj"] Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.675823 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-db-sync-config-data\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.675881 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-combined-ca-bundle\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.676173 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-config-data\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.676333 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnm75\" (UniqueName: \"kubernetes.io/projected/ab9a5a1a-cacb-4ae7-a087-e50045584210-kube-api-access-qnm75\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.781210 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-db-sync-config-data\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.781279 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-combined-ca-bundle\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.781405 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-config-data\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.781480 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnm75\" (UniqueName: \"kubernetes.io/projected/ab9a5a1a-cacb-4ae7-a087-e50045584210-kube-api-access-qnm75\") pod 
\"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.785315 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-combined-ca-bundle\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.787619 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-config-data\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.788935 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-db-sync-config-data\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.798441 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnm75\" (UniqueName: \"kubernetes.io/projected/ab9a5a1a-cacb-4ae7-a087-e50045584210-kube-api-access-qnm75\") pod \"glance-db-sync-ndlfj\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:47 crc kubenswrapper[4799]: I0319 20:22:47.932837 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ndlfj" Mar 19 20:22:48 crc kubenswrapper[4799]: I0319 20:22:48.668661 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ndlfj"] Mar 19 20:22:48 crc kubenswrapper[4799]: I0319 20:22:48.687153 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-h89k2" podUID="cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8" containerName="ovn-controller" probeResult="failure" output=< Mar 19 20:22:48 crc kubenswrapper[4799]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 20:22:48 crc kubenswrapper[4799]: > Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.125803 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41613e4d-5e3c-4d83-80be-159885c967bf" path="/var/lib/kubelet/pods/41613e4d-5e3c-4d83-80be-159885c967bf/volumes" Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.324935 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ndlfj" event={"ID":"ab9a5a1a-cacb-4ae7-a087-e50045584210","Type":"ContainerStarted","Data":"a946acea1d5911609fca89283788f72d1c9dfab6bfd98d2d59567a3aa6e63ff5"} Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.335844 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"9e3cbfa1a914fda68cb2fbfa59cebb33209025942d983adbcd0ef3ac4575bf5c"} Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.335914 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"43c5fecfcd9a7c98e37820167ce175bb826d37175ca5ac8a180d36d389b458c2"} Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.335938 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"3dea200f546d82212c05c1bdb328d623d7d1212777879a04b53b2c7758826c0e"} Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.335954 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"a3f40065af34b9ac2a0e1ccac42aa172317d145a52572293753d1f7245971309"} Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.335973 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"76ff80036a7afff6edb4b582c6d8bd4c32b82b46a4c773e8c59e4c5f03dd1ccb"} Mar 19 20:22:49 crc kubenswrapper[4799]: I0319 20:22:49.335988 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"97f3c542f6af23f9068f77246d9ecb094bbf9058d61c4ecb6097e82c68943546"} Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.360321 4799 generic.go:334] "Generic (PLEG): container finished" podID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerID="d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074" exitCode=0 Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.360414 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"749a043f-5262-416d-b639-9ff8fdcf7f12","Type":"ContainerDied","Data":"d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074"} Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.363613 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6585v"] Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.363741 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ee15a17-4d32-468e-8a57-2a597cebd850" 
containerID="37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e" exitCode=0 Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.365074 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ee15a17-4d32-468e-8a57-2a597cebd850","Type":"ContainerDied","Data":"37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e"} Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.365262 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.369712 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.386966 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6585v"] Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.398250 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa0c0465-9e46-41e7-88b3-07a6da9cd6c7","Type":"ContainerStarted","Data":"825a4976529d167ffd2ca6c069ad661fbfb20d10ba99b324a0d7551b9bcd120f"} Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.466739 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.17200562 podStartE2EDuration="26.466715886s" podCreationTimestamp="2026-03-19 20:22:24 +0000 UTC" firstStartedPulling="2026-03-19 20:22:41.881648612 +0000 UTC m=+1039.487601684" lastFinishedPulling="2026-03-19 20:22:48.176358878 +0000 UTC m=+1045.782311950" observedRunningTime="2026-03-19 20:22:50.46286603 +0000 UTC m=+1048.068819112" watchObservedRunningTime="2026-03-19 20:22:50.466715886 +0000 UTC m=+1048.072668968" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.523033 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c58bd42-1add-4893-96c1-bb363f7e2297-operator-scripts\") pod \"root-account-create-update-6585v\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.523750 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcdpg\" (UniqueName: \"kubernetes.io/projected/3c58bd42-1add-4893-96c1-bb363f7e2297-kube-api-access-bcdpg\") pod \"root-account-create-update-6585v\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.625856 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcdpg\" (UniqueName: \"kubernetes.io/projected/3c58bd42-1add-4893-96c1-bb363f7e2297-kube-api-access-bcdpg\") pod \"root-account-create-update-6585v\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.625930 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c58bd42-1add-4893-96c1-bb363f7e2297-operator-scripts\") pod \"root-account-create-update-6585v\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.628567 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c58bd42-1add-4893-96c1-bb363f7e2297-operator-scripts\") pod \"root-account-create-update-6585v\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.647125 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcdpg\" (UniqueName: \"kubernetes.io/projected/3c58bd42-1add-4893-96c1-bb363f7e2297-kube-api-access-bcdpg\") pod \"root-account-create-update-6585v\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.749263 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-w5tpt"] Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.750971 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.752963 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.757468 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-w5tpt"] Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.905830 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6585v" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.930906 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-svc\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.930982 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-swift-storage-0\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.931014 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-nb\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.931035 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-sb\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.931070 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db4qh\" (UniqueName: \"kubernetes.io/projected/091c8182-6d14-4d5b-be74-afa07b3d201f-kube-api-access-db4qh\") pod 
\"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:50 crc kubenswrapper[4799]: I0319 20:22:50.931095 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-config\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.032694 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-swift-storage-0\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.032965 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-nb\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.032990 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-sb\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.033027 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db4qh\" (UniqueName: \"kubernetes.io/projected/091c8182-6d14-4d5b-be74-afa07b3d201f-kube-api-access-db4qh\") pod 
\"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.033058 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-config\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.033108 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-svc\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.034104 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-svc\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.034111 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-config\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.034171 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-sb\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 
20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.034227 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-nb\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.034514 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-swift-storage-0\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.051572 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db4qh\" (UniqueName: \"kubernetes.io/projected/091c8182-6d14-4d5b-be74-afa07b3d201f-kube-api-access-db4qh\") pod \"dnsmasq-dns-86cbdd8bfc-w5tpt\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.104375 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.362670 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-w5tpt"] Mar 19 20:22:51 crc kubenswrapper[4799]: W0319 20:22:51.377224 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod091c8182_6d14_4d5b_be74_afa07b3d201f.slice/crio-2ad81c4cb557ef3020c70ca51ffe86d573f5c55a67c455c995696642ff6a9184 WatchSource:0}: Error finding container 2ad81c4cb557ef3020c70ca51ffe86d573f5c55a67c455c995696642ff6a9184: Status 404 returned error can't find the container with id 2ad81c4cb557ef3020c70ca51ffe86d573f5c55a67c455c995696642ff6a9184 Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.382663 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6585v"] Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.406255 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" event={"ID":"091c8182-6d14-4d5b-be74-afa07b3d201f","Type":"ContainerStarted","Data":"2ad81c4cb557ef3020c70ca51ffe86d573f5c55a67c455c995696642ff6a9184"} Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.407842 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6585v" event={"ID":"3c58bd42-1add-4893-96c1-bb363f7e2297","Type":"ContainerStarted","Data":"8a1b3283ee187de2867ff886c3b436a786dd75005a3fde4207fb4fdc19768a97"} Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.410656 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"749a043f-5262-416d-b639-9ff8fdcf7f12","Type":"ContainerStarted","Data":"f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb"} Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.410886 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.415198 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ee15a17-4d32-468e-8a57-2a597cebd850","Type":"ContainerStarted","Data":"2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d"} Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.415991 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.433796 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.094152442 podStartE2EDuration="1m4.433778861s" podCreationTimestamp="2026-03-19 20:21:47 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.388827763 +0000 UTC m=+1005.994780825" lastFinishedPulling="2026-03-19 20:22:16.728454172 +0000 UTC m=+1014.334407244" observedRunningTime="2026-03-19 20:22:51.433290147 +0000 UTC m=+1049.039243219" watchObservedRunningTime="2026-03-19 20:22:51.433778861 +0000 UTC m=+1049.039731933" Mar 19 20:22:51 crc kubenswrapper[4799]: I0319 20:22:51.464357 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.924017034 podStartE2EDuration="1m4.464338439s" podCreationTimestamp="2026-03-19 20:21:47 +0000 UTC" firstStartedPulling="2026-03-19 20:22:08.309274441 +0000 UTC m=+1005.915227513" lastFinishedPulling="2026-03-19 20:22:16.849595846 +0000 UTC m=+1014.455548918" observedRunningTime="2026-03-19 20:22:51.462286903 +0000 UTC m=+1049.068239985" watchObservedRunningTime="2026-03-19 20:22:51.464338439 +0000 UTC m=+1049.070291511" Mar 19 20:22:52 crc kubenswrapper[4799]: I0319 20:22:52.433323 4799 generic.go:334] "Generic (PLEG): container finished" podID="091c8182-6d14-4d5b-be74-afa07b3d201f" 
containerID="3538bcede0a47f2d33ec71e8a9e32481c8848678421be9c85626d924f1bc3950" exitCode=0 Mar 19 20:22:52 crc kubenswrapper[4799]: I0319 20:22:52.433414 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" event={"ID":"091c8182-6d14-4d5b-be74-afa07b3d201f","Type":"ContainerDied","Data":"3538bcede0a47f2d33ec71e8a9e32481c8848678421be9c85626d924f1bc3950"} Mar 19 20:22:52 crc kubenswrapper[4799]: I0319 20:22:52.435962 4799 generic.go:334] "Generic (PLEG): container finished" podID="3c58bd42-1add-4893-96c1-bb363f7e2297" containerID="1f4655b9587148fdd509f653e27f24523ab1c2e1c8436e661eee2bdf6afa7405" exitCode=0 Mar 19 20:22:52 crc kubenswrapper[4799]: I0319 20:22:52.436031 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6585v" event={"ID":"3c58bd42-1add-4893-96c1-bb363f7e2297","Type":"ContainerDied","Data":"1f4655b9587148fdd509f653e27f24523ab1c2e1c8436e661eee2bdf6afa7405"} Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.445648 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" event={"ID":"091c8182-6d14-4d5b-be74-afa07b3d201f","Type":"ContainerStarted","Data":"732739c5eb90e7745fe42dbb831e94711711466b8cd8ab562b9e153e33dd7149"} Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.446111 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.475435 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" podStartSLOduration=3.475413095 podStartE2EDuration="3.475413095s" podCreationTimestamp="2026-03-19 20:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:22:53.471666782 +0000 UTC m=+1051.077619874" watchObservedRunningTime="2026-03-19 
20:22:53.475413095 +0000 UTC m=+1051.081366177" Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.712798 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.713779 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-h89k2" podUID="cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8" containerName="ovn-controller" probeResult="failure" output=< Mar 19 20:22:53 crc kubenswrapper[4799]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 20:22:53 crc kubenswrapper[4799]: > Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.732484 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dcj2n" Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.957116 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-h89k2-config-b8f7m"] Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.958333 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.960661 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 20:22:53 crc kubenswrapper[4799]: I0319 20:22:53.968227 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h89k2-config-b8f7m"] Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.088278 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run-ovn\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.088328 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-scripts\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.088401 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-additional-scripts\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.088442 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-log-ovn\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: 
\"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.088471 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.088496 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5gb\" (UniqueName: \"kubernetes.io/projected/1a41132e-c5b8-4d93-b330-796fb73e0ede-kube-api-access-bz5gb\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189350 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-log-ovn\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189422 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189453 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5gb\" (UniqueName: \"kubernetes.io/projected/1a41132e-c5b8-4d93-b330-796fb73e0ede-kube-api-access-bz5gb\") pod 
\"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189504 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run-ovn\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189524 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-scripts\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189882 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run-ovn\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189892 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.189933 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-log-ovn\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: 
\"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.190683 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-additional-scripts\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.191419 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-additional-scripts\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.192978 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-scripts\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.209636 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5gb\" (UniqueName: \"kubernetes.io/projected/1a41132e-c5b8-4d93-b330-796fb73e0ede-kube-api-access-bz5gb\") pod \"ovn-controller-h89k2-config-b8f7m\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:54 crc kubenswrapper[4799]: I0319 20:22:54.288199 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:22:58 crc kubenswrapper[4799]: I0319 20:22:58.669316 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-h89k2" podUID="cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8" containerName="ovn-controller" probeResult="failure" output=< Mar 19 20:22:58 crc kubenswrapper[4799]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 20:22:58 crc kubenswrapper[4799]: > Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.509823 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6585v" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.541339 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6585v" event={"ID":"3c58bd42-1add-4893-96c1-bb363f7e2297","Type":"ContainerDied","Data":"8a1b3283ee187de2867ff886c3b436a786dd75005a3fde4207fb4fdc19768a97"} Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.541376 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a1b3283ee187de2867ff886c3b436a786dd75005a3fde4207fb4fdc19768a97" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.541405 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6585v" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.608782 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcdpg\" (UniqueName: \"kubernetes.io/projected/3c58bd42-1add-4893-96c1-bb363f7e2297-kube-api-access-bcdpg\") pod \"3c58bd42-1add-4893-96c1-bb363f7e2297\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.608880 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c58bd42-1add-4893-96c1-bb363f7e2297-operator-scripts\") pod \"3c58bd42-1add-4893-96c1-bb363f7e2297\" (UID: \"3c58bd42-1add-4893-96c1-bb363f7e2297\") " Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.610869 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c58bd42-1add-4893-96c1-bb363f7e2297-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c58bd42-1add-4893-96c1-bb363f7e2297" (UID: "3c58bd42-1add-4893-96c1-bb363f7e2297"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.617575 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c58bd42-1add-4893-96c1-bb363f7e2297-kube-api-access-bcdpg" (OuterVolumeSpecName: "kube-api-access-bcdpg") pod "3c58bd42-1add-4893-96c1-bb363f7e2297" (UID: "3c58bd42-1add-4893-96c1-bb363f7e2297"). InnerVolumeSpecName "kube-api-access-bcdpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.710417 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcdpg\" (UniqueName: \"kubernetes.io/projected/3c58bd42-1add-4893-96c1-bb363f7e2297-kube-api-access-bcdpg\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.710460 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c58bd42-1add-4893-96c1-bb363f7e2297-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:00 crc kubenswrapper[4799]: I0319 20:23:00.874704 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h89k2-config-b8f7m"] Mar 19 20:23:00 crc kubenswrapper[4799]: W0319 20:23:00.882714 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a41132e_c5b8_4d93_b330_796fb73e0ede.slice/crio-44bb4bc493a6549e5252b85f660b36d7de3464560deb551cf9cf10855db81dc4 WatchSource:0}: Error finding container 44bb4bc493a6549e5252b85f660b36d7de3464560deb551cf9cf10855db81dc4: Status 404 returned error can't find the container with id 44bb4bc493a6549e5252b85f660b36d7de3464560deb551cf9cf10855db81dc4 Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.106562 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.171376 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-lvbzp"] Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.171611 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" containerName="dnsmasq-dns" 
containerID="cri-o://5676b690020210ecb247cd185ae33755305c7ceb5fc992bfa9b28ea2f8c95f5a" gracePeriod=10 Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.557556 4799 generic.go:334] "Generic (PLEG): container finished" podID="1a41132e-c5b8-4d93-b330-796fb73e0ede" containerID="6c7e1c1307ea4516951caf7219f73245eefc35a6e86a8e373445f4d7d411f60d" exitCode=0 Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.557876 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2-config-b8f7m" event={"ID":"1a41132e-c5b8-4d93-b330-796fb73e0ede","Type":"ContainerDied","Data":"6c7e1c1307ea4516951caf7219f73245eefc35a6e86a8e373445f4d7d411f60d"} Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.557902 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2-config-b8f7m" event={"ID":"1a41132e-c5b8-4d93-b330-796fb73e0ede","Type":"ContainerStarted","Data":"44bb4bc493a6549e5252b85f660b36d7de3464560deb551cf9cf10855db81dc4"} Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.566526 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ndlfj" event={"ID":"ab9a5a1a-cacb-4ae7-a087-e50045584210","Type":"ContainerStarted","Data":"6d8d8eeb0d0e78e213af3c072a714d7de02ce7dadd70e999f11e4e93c4595d34"} Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.573479 4799 generic.go:334] "Generic (PLEG): container finished" podID="fc766b6d-f977-4050-a719-5327a3a351e0" containerID="5676b690020210ecb247cd185ae33755305c7ceb5fc992bfa9b28ea2f8c95f5a" exitCode=0 Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.573526 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" event={"ID":"fc766b6d-f977-4050-a719-5327a3a351e0","Type":"ContainerDied","Data":"5676b690020210ecb247cd185ae33755305c7ceb5fc992bfa9b28ea2f8c95f5a"} Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.573553 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" event={"ID":"fc766b6d-f977-4050-a719-5327a3a351e0","Type":"ContainerDied","Data":"403af416d84fd1a8d54fa7e5b72c12b61a927d24f057a20248533f88b83f9c23"} Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.573564 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403af416d84fd1a8d54fa7e5b72c12b61a927d24f057a20248533f88b83f9c23" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.585442 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.593715 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ndlfj" podStartSLOduration=2.900989202 podStartE2EDuration="14.59369619s" podCreationTimestamp="2026-03-19 20:22:47 +0000 UTC" firstStartedPulling="2026-03-19 20:22:48.710043482 +0000 UTC m=+1046.315996554" lastFinishedPulling="2026-03-19 20:23:00.40275046 +0000 UTC m=+1058.008703542" observedRunningTime="2026-03-19 20:23:01.591335185 +0000 UTC m=+1059.197288257" watchObservedRunningTime="2026-03-19 20:23:01.59369619 +0000 UTC m=+1059.199649282" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.628155 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-config\") pod \"fc766b6d-f977-4050-a719-5327a3a351e0\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.628227 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-nb\") pod \"fc766b6d-f977-4050-a719-5327a3a351e0\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.629101 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-dns-svc\") pod \"fc766b6d-f977-4050-a719-5327a3a351e0\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.629490 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqvxr\" (UniqueName: \"kubernetes.io/projected/fc766b6d-f977-4050-a719-5327a3a351e0-kube-api-access-fqvxr\") pod \"fc766b6d-f977-4050-a719-5327a3a351e0\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.629519 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-sb\") pod \"fc766b6d-f977-4050-a719-5327a3a351e0\" (UID: \"fc766b6d-f977-4050-a719-5327a3a351e0\") " Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.634640 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc766b6d-f977-4050-a719-5327a3a351e0-kube-api-access-fqvxr" (OuterVolumeSpecName: "kube-api-access-fqvxr") pod "fc766b6d-f977-4050-a719-5327a3a351e0" (UID: "fc766b6d-f977-4050-a719-5327a3a351e0"). InnerVolumeSpecName "kube-api-access-fqvxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.665760 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-config" (OuterVolumeSpecName: "config") pod "fc766b6d-f977-4050-a719-5327a3a351e0" (UID: "fc766b6d-f977-4050-a719-5327a3a351e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.666414 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc766b6d-f977-4050-a719-5327a3a351e0" (UID: "fc766b6d-f977-4050-a719-5327a3a351e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.675211 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc766b6d-f977-4050-a719-5327a3a351e0" (UID: "fc766b6d-f977-4050-a719-5327a3a351e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.679962 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc766b6d-f977-4050-a719-5327a3a351e0" (UID: "fc766b6d-f977-4050-a719-5327a3a351e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.731792 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.731827 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.731836 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.731846 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqvxr\" (UniqueName: \"kubernetes.io/projected/fc766b6d-f977-4050-a719-5327a3a351e0-kube-api-access-fqvxr\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:01 crc kubenswrapper[4799]: I0319 20:23:01.731855 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc766b6d-f977-4050-a719-5327a3a351e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:02 crc kubenswrapper[4799]: I0319 20:23:02.582232 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-lvbzp" Mar 19 20:23:02 crc kubenswrapper[4799]: I0319 20:23:02.625456 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-lvbzp"] Mar 19 20:23:02 crc kubenswrapper[4799]: I0319 20:23:02.633368 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-lvbzp"] Mar 19 20:23:02 crc kubenswrapper[4799]: I0319 20:23:02.966309 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056604 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run-ovn\") pod \"1a41132e-c5b8-4d93-b330-796fb73e0ede\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056665 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-additional-scripts\") pod \"1a41132e-c5b8-4d93-b330-796fb73e0ede\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056707 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run\") pod \"1a41132e-c5b8-4d93-b330-796fb73e0ede\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056718 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1a41132e-c5b8-4d93-b330-796fb73e0ede" (UID: "1a41132e-c5b8-4d93-b330-796fb73e0ede"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056734 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-scripts\") pod \"1a41132e-c5b8-4d93-b330-796fb73e0ede\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056912 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz5gb\" (UniqueName: \"kubernetes.io/projected/1a41132e-c5b8-4d93-b330-796fb73e0ede-kube-api-access-bz5gb\") pod \"1a41132e-c5b8-4d93-b330-796fb73e0ede\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.056931 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-log-ovn\") pod \"1a41132e-c5b8-4d93-b330-796fb73e0ede\" (UID: \"1a41132e-c5b8-4d93-b330-796fb73e0ede\") " Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057206 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run" (OuterVolumeSpecName: "var-run") pod "1a41132e-c5b8-4d93-b330-796fb73e0ede" (UID: "1a41132e-c5b8-4d93-b330-796fb73e0ede"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057279 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1a41132e-c5b8-4d93-b330-796fb73e0ede" (UID: "1a41132e-c5b8-4d93-b330-796fb73e0ede"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057465 4799 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057481 4799 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057494 4799 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1a41132e-c5b8-4d93-b330-796fb73e0ede-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057499 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1a41132e-c5b8-4d93-b330-796fb73e0ede" (UID: "1a41132e-c5b8-4d93-b330-796fb73e0ede"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.057984 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-scripts" (OuterVolumeSpecName: "scripts") pod "1a41132e-c5b8-4d93-b330-796fb73e0ede" (UID: "1a41132e-c5b8-4d93-b330-796fb73e0ede"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.063610 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a41132e-c5b8-4d93-b330-796fb73e0ede-kube-api-access-bz5gb" (OuterVolumeSpecName: "kube-api-access-bz5gb") pod "1a41132e-c5b8-4d93-b330-796fb73e0ede" (UID: "1a41132e-c5b8-4d93-b330-796fb73e0ede"). InnerVolumeSpecName "kube-api-access-bz5gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.129196 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" path="/var/lib/kubelet/pods/fc766b6d-f977-4050-a719-5327a3a351e0/volumes" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.159128 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz5gb\" (UniqueName: \"kubernetes.io/projected/1a41132e-c5b8-4d93-b330-796fb73e0ede-kube-api-access-bz5gb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.159156 4799 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.159167 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a41132e-c5b8-4d93-b330-796fb73e0ede-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.591222 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2-config-b8f7m" event={"ID":"1a41132e-c5b8-4d93-b330-796fb73e0ede","Type":"ContainerDied","Data":"44bb4bc493a6549e5252b85f660b36d7de3464560deb551cf9cf10855db81dc4"} Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.591599 4799 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="44bb4bc493a6549e5252b85f660b36d7de3464560deb551cf9cf10855db81dc4" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.591287 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h89k2-config-b8f7m" Mar 19 20:23:03 crc kubenswrapper[4799]: I0319 20:23:03.687936 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-h89k2" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.069824 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-h89k2-config-b8f7m"] Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.079455 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-h89k2-config-b8f7m"] Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.147251 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-h89k2-config-p926l"] Mar 19 20:23:04 crc kubenswrapper[4799]: E0319 20:23:04.147844 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" containerName="init" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.147919 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" containerName="init" Mar 19 20:23:04 crc kubenswrapper[4799]: E0319 20:23:04.147989 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a41132e-c5b8-4d93-b330-796fb73e0ede" containerName="ovn-config" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.148048 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a41132e-c5b8-4d93-b330-796fb73e0ede" containerName="ovn-config" Mar 19 20:23:04 crc kubenswrapper[4799]: E0319 20:23:04.148105 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c58bd42-1add-4893-96c1-bb363f7e2297" containerName="mariadb-account-create-update" Mar 19 20:23:04 crc 
kubenswrapper[4799]: I0319 20:23:04.148155 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c58bd42-1add-4893-96c1-bb363f7e2297" containerName="mariadb-account-create-update" Mar 19 20:23:04 crc kubenswrapper[4799]: E0319 20:23:04.148230 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" containerName="dnsmasq-dns" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.148282 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" containerName="dnsmasq-dns" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.148524 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a41132e-c5b8-4d93-b330-796fb73e0ede" containerName="ovn-config" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.148630 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc766b6d-f977-4050-a719-5327a3a351e0" containerName="dnsmasq-dns" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.148719 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c58bd42-1add-4893-96c1-bb363f7e2297" containerName="mariadb-account-create-update" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.149486 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.153366 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.164196 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h89k2-config-p926l"] Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.176749 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnkg4\" (UniqueName: \"kubernetes.io/projected/d39f5fee-8bab-428c-8240-36a14c147ebb-kube-api-access-vnkg4\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.176848 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run-ovn\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.176889 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-additional-scripts\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.176967 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-scripts\") pod \"ovn-controller-h89k2-config-p926l\" (UID: 
\"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.177003 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.177100 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-log-ovn\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.278527 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-scripts\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.278568 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.278631 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-log-ovn\") pod \"ovn-controller-h89k2-config-p926l\" (UID: 
\"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.278684 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnkg4\" (UniqueName: \"kubernetes.io/projected/d39f5fee-8bab-428c-8240-36a14c147ebb-kube-api-access-vnkg4\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.278719 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run-ovn\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.278739 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-additional-scripts\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.279181 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-log-ovn\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.279302 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run\") pod \"ovn-controller-h89k2-config-p926l\" (UID: 
\"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.279359 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run-ovn\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.279323 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-additional-scripts\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.280417 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-scripts\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.297533 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnkg4\" (UniqueName: \"kubernetes.io/projected/d39f5fee-8bab-428c-8240-36a14c147ebb-kube-api-access-vnkg4\") pod \"ovn-controller-h89k2-config-p926l\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:04 crc kubenswrapper[4799]: I0319 20:23:04.471129 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:05 crc kubenswrapper[4799]: I0319 20:23:05.032022 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-h89k2-config-p926l"] Mar 19 20:23:05 crc kubenswrapper[4799]: I0319 20:23:05.155877 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a41132e-c5b8-4d93-b330-796fb73e0ede" path="/var/lib/kubelet/pods/1a41132e-c5b8-4d93-b330-796fb73e0ede/volumes" Mar 19 20:23:05 crc kubenswrapper[4799]: I0319 20:23:05.626864 4799 generic.go:334] "Generic (PLEG): container finished" podID="d39f5fee-8bab-428c-8240-36a14c147ebb" containerID="3e62a82de3b89bd879aa78f98e9b11ed16982ea312eb10fa4da0c5e2af39e3bb" exitCode=0 Mar 19 20:23:05 crc kubenswrapper[4799]: I0319 20:23:05.626919 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2-config-p926l" event={"ID":"d39f5fee-8bab-428c-8240-36a14c147ebb","Type":"ContainerDied","Data":"3e62a82de3b89bd879aa78f98e9b11ed16982ea312eb10fa4da0c5e2af39e3bb"} Mar 19 20:23:05 crc kubenswrapper[4799]: I0319 20:23:05.626994 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2-config-p926l" event={"ID":"d39f5fee-8bab-428c-8240-36a14c147ebb","Type":"ContainerStarted","Data":"952ed88723b8243552a45a68963158e9d671789dae8e438b39abf5318928a135"} Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.028437 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.128893 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnkg4\" (UniqueName: \"kubernetes.io/projected/d39f5fee-8bab-428c-8240-36a14c147ebb-kube-api-access-vnkg4\") pod \"d39f5fee-8bab-428c-8240-36a14c147ebb\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129040 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-log-ovn\") pod \"d39f5fee-8bab-428c-8240-36a14c147ebb\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129127 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d39f5fee-8bab-428c-8240-36a14c147ebb" (UID: "d39f5fee-8bab-428c-8240-36a14c147ebb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129183 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-scripts\") pod \"d39f5fee-8bab-428c-8240-36a14c147ebb\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129449 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run" (OuterVolumeSpecName: "var-run") pod "d39f5fee-8bab-428c-8240-36a14c147ebb" (UID: "d39f5fee-8bab-428c-8240-36a14c147ebb"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129534 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run\") pod \"d39f5fee-8bab-428c-8240-36a14c147ebb\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129622 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run-ovn\") pod \"d39f5fee-8bab-428c-8240-36a14c147ebb\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129685 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d39f5fee-8bab-428c-8240-36a14c147ebb" (UID: "d39f5fee-8bab-428c-8240-36a14c147ebb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.129764 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-additional-scripts\") pod \"d39f5fee-8bab-428c-8240-36a14c147ebb\" (UID: \"d39f5fee-8bab-428c-8240-36a14c147ebb\") " Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.130488 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d39f5fee-8bab-428c-8240-36a14c147ebb" (UID: "d39f5fee-8bab-428c-8240-36a14c147ebb"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.130801 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-scripts" (OuterVolumeSpecName: "scripts") pod "d39f5fee-8bab-428c-8240-36a14c147ebb" (UID: "d39f5fee-8bab-428c-8240-36a14c147ebb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.131474 4799 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.131529 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.131544 4799 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.131560 4799 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d39f5fee-8bab-428c-8240-36a14c147ebb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.131575 4799 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d39f5fee-8bab-428c-8240-36a14c147ebb-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.138320 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d39f5fee-8bab-428c-8240-36a14c147ebb-kube-api-access-vnkg4" (OuterVolumeSpecName: "kube-api-access-vnkg4") pod "d39f5fee-8bab-428c-8240-36a14c147ebb" (UID: "d39f5fee-8bab-428c-8240-36a14c147ebb"). InnerVolumeSpecName "kube-api-access-vnkg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.233626 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnkg4\" (UniqueName: \"kubernetes.io/projected/d39f5fee-8bab-428c-8240-36a14c147ebb-kube-api-access-vnkg4\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.651434 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-h89k2-config-p926l" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.651425 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-h89k2-config-p926l" event={"ID":"d39f5fee-8bab-428c-8240-36a14c147ebb","Type":"ContainerDied","Data":"952ed88723b8243552a45a68963158e9d671789dae8e438b39abf5318928a135"} Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.651628 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952ed88723b8243552a45a68963158e9d671789dae8e438b39abf5318928a135" Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.655009 4799 generic.go:334] "Generic (PLEG): container finished" podID="ab9a5a1a-cacb-4ae7-a087-e50045584210" containerID="6d8d8eeb0d0e78e213af3c072a714d7de02ce7dadd70e999f11e4e93c4595d34" exitCode=0 Mar 19 20:23:07 crc kubenswrapper[4799]: I0319 20:23:07.655076 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ndlfj" event={"ID":"ab9a5a1a-cacb-4ae7-a087-e50045584210","Type":"ContainerDied","Data":"6d8d8eeb0d0e78e213af3c072a714d7de02ce7dadd70e999f11e4e93c4595d34"} Mar 19 20:23:08 crc kubenswrapper[4799]: I0319 20:23:08.144173 4799 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ovn-controller-h89k2-config-p926l"] Mar 19 20:23:08 crc kubenswrapper[4799]: I0319 20:23:08.167662 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-h89k2-config-p926l"] Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.082576 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ndlfj" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.131755 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39f5fee-8bab-428c-8240-36a14c147ebb" path="/var/lib/kubelet/pods/d39f5fee-8bab-428c-8240-36a14c147ebb/volumes" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.169916 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnm75\" (UniqueName: \"kubernetes.io/projected/ab9a5a1a-cacb-4ae7-a087-e50045584210-kube-api-access-qnm75\") pod \"ab9a5a1a-cacb-4ae7-a087-e50045584210\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.169988 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-db-sync-config-data\") pod \"ab9a5a1a-cacb-4ae7-a087-e50045584210\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.170127 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-combined-ca-bundle\") pod \"ab9a5a1a-cacb-4ae7-a087-e50045584210\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.170185 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-config-data\") pod \"ab9a5a1a-cacb-4ae7-a087-e50045584210\" (UID: \"ab9a5a1a-cacb-4ae7-a087-e50045584210\") " Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.177810 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9a5a1a-cacb-4ae7-a087-e50045584210-kube-api-access-qnm75" (OuterVolumeSpecName: "kube-api-access-qnm75") pod "ab9a5a1a-cacb-4ae7-a087-e50045584210" (UID: "ab9a5a1a-cacb-4ae7-a087-e50045584210"). InnerVolumeSpecName "kube-api-access-qnm75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.178185 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ab9a5a1a-cacb-4ae7-a087-e50045584210" (UID: "ab9a5a1a-cacb-4ae7-a087-e50045584210"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.225854 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab9a5a1a-cacb-4ae7-a087-e50045584210" (UID: "ab9a5a1a-cacb-4ae7-a087-e50045584210"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.228029 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-config-data" (OuterVolumeSpecName: "config-data") pod "ab9a5a1a-cacb-4ae7-a087-e50045584210" (UID: "ab9a5a1a-cacb-4ae7-a087-e50045584210"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.253655 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.276254 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnm75\" (UniqueName: \"kubernetes.io/projected/ab9a5a1a-cacb-4ae7-a087-e50045584210-kube-api-access-qnm75\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.276318 4799 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.276333 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.276345 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9a5a1a-cacb-4ae7-a087-e50045584210-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.324548 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.684639 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ndlfj" event={"ID":"ab9a5a1a-cacb-4ae7-a087-e50045584210","Type":"ContainerDied","Data":"a946acea1d5911609fca89283788f72d1c9dfab6bfd98d2d59567a3aa6e63ff5"} Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.684686 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ndlfj" Mar 19 20:23:09 crc kubenswrapper[4799]: I0319 20:23:09.684699 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a946acea1d5911609fca89283788f72d1c9dfab6bfd98d2d59567a3aa6e63ff5" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.100120 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-hr57v"] Mar 19 20:23:10 crc kubenswrapper[4799]: E0319 20:23:10.109889 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9a5a1a-cacb-4ae7-a087-e50045584210" containerName="glance-db-sync" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.109907 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9a5a1a-cacb-4ae7-a087-e50045584210" containerName="glance-db-sync" Mar 19 20:23:10 crc kubenswrapper[4799]: E0319 20:23:10.109966 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39f5fee-8bab-428c-8240-36a14c147ebb" containerName="ovn-config" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.109974 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39f5fee-8bab-428c-8240-36a14c147ebb" containerName="ovn-config" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.110137 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39f5fee-8bab-428c-8240-36a14c147ebb" containerName="ovn-config" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.110161 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9a5a1a-cacb-4ae7-a087-e50045584210" containerName="glance-db-sync" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.110922 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.133749 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-hr57v"] Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.192603 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9cp\" (UniqueName: \"kubernetes.io/projected/a2923901-dc23-464a-871f-85d66cebadb1-kube-api-access-zd9cp\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.192639 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-nb\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.192682 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-svc\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.192710 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-config\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.192742 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-swift-storage-0\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.192790 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-sb\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.294503 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9cp\" (UniqueName: \"kubernetes.io/projected/a2923901-dc23-464a-871f-85d66cebadb1-kube-api-access-zd9cp\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.294554 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-nb\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.294585 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-svc\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.294615 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-config\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.294647 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-swift-storage-0\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.294685 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-sb\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.295595 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-sb\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.295657 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-svc\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.296156 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-config\") 
pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.296201 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-nb\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.296663 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-swift-storage-0\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.314311 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9cp\" (UniqueName: \"kubernetes.io/projected/a2923901-dc23-464a-871f-85d66cebadb1-kube-api-access-zd9cp\") pod \"dnsmasq-dns-6d88577c8c-hr57v\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.448498 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v"
Mar 19 20:23:10 crc kubenswrapper[4799]: W0319 20:23:10.991817 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2923901_dc23_464a_871f_85d66cebadb1.slice/crio-c92660e1be29e6fe60eb2882b3dfa9b55f442444d54fa1316fbd7e2e37f5fca1 WatchSource:0}: Error finding container c92660e1be29e6fe60eb2882b3dfa9b55f442444d54fa1316fbd7e2e37f5fca1: Status 404 returned error can't find the container with id c92660e1be29e6fe60eb2882b3dfa9b55f442444d54fa1316fbd7e2e37f5fca1
Mar 19 20:23:10 crc kubenswrapper[4799]: I0319 20:23:10.992482 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-hr57v"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.414827 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h8b7k"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.415930 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.427817 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h8b7k"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.511254 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7kg\" (UniqueName: \"kubernetes.io/projected/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-kube-api-access-8f7kg\") pod \"cinder-db-create-h8b7k\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.511336 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-operator-scripts\") pod \"cinder-db-create-h8b7k\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.525656 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-42a2-account-create-update-c2ghz"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.526696 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.530180 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.536129 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-42a2-account-create-update-c2ghz"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.612369 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-operator-scripts\") pod \"cinder-db-create-h8b7k\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.612472 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgffj\" (UniqueName: \"kubernetes.io/projected/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-kube-api-access-sgffj\") pod \"cinder-42a2-account-create-update-c2ghz\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.612516 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-operator-scripts\") pod \"cinder-42a2-account-create-update-c2ghz\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.612532 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7kg\" (UniqueName: \"kubernetes.io/projected/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-kube-api-access-8f7kg\") pod \"cinder-db-create-h8b7k\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.613178 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-operator-scripts\") pod \"cinder-db-create-h8b7k\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.621985 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-gk65m"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.622898 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.630980 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gk65m"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.633955 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7kg\" (UniqueName: \"kubernetes.io/projected/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-kube-api-access-8f7kg\") pod \"cinder-db-create-h8b7k\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.713754 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgffj\" (UniqueName: \"kubernetes.io/projected/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-kube-api-access-sgffj\") pod \"cinder-42a2-account-create-update-c2ghz\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.713814 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-operator-scripts\") pod \"cinder-42a2-account-create-update-c2ghz\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.713837 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee26f4f-790a-479f-b0b0-23c17a5aa642-operator-scripts\") pod \"barbican-db-create-gk65m\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") " pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.713865 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94mk8\" (UniqueName: \"kubernetes.io/projected/2ee26f4f-790a-479f-b0b0-23c17a5aa642-kube-api-access-94mk8\") pod \"barbican-db-create-gk65m\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") " pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.714540 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-operator-scripts\") pod \"cinder-42a2-account-create-update-c2ghz\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.728047 4799 generic.go:334] "Generic (PLEG): container finished" podID="a2923901-dc23-464a-871f-85d66cebadb1" containerID="ae2e6cd8f1c327fd69f6d8cbd30bc5dee7e83ab32533d4d5b31fccf0a59a0ae7" exitCode=0
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.728090 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" event={"ID":"a2923901-dc23-464a-871f-85d66cebadb1","Type":"ContainerDied","Data":"ae2e6cd8f1c327fd69f6d8cbd30bc5dee7e83ab32533d4d5b31fccf0a59a0ae7"}
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.728117 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" event={"ID":"a2923901-dc23-464a-871f-85d66cebadb1","Type":"ContainerStarted","Data":"c92660e1be29e6fe60eb2882b3dfa9b55f442444d54fa1316fbd7e2e37f5fca1"}
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.730839 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.747139 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5219-account-create-update-bp54l"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.748242 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.752340 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgffj\" (UniqueName: \"kubernetes.io/projected/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-kube-api-access-sgffj\") pod \"cinder-42a2-account-create-update-c2ghz\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.754992 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.773289 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5219-account-create-update-bp54l"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.815975 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94mk8\" (UniqueName: \"kubernetes.io/projected/2ee26f4f-790a-479f-b0b0-23c17a5aa642-kube-api-access-94mk8\") pod \"barbican-db-create-gk65m\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") " pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.816053 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9661e457-f424-4642-8551-c61fc2924ae9-operator-scripts\") pod \"barbican-5219-account-create-update-bp54l\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.816121 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dbq\" (UniqueName: \"kubernetes.io/projected/9661e457-f424-4642-8551-c61fc2924ae9-kube-api-access-74dbq\") pod \"barbican-5219-account-create-update-bp54l\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.816232 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee26f4f-790a-479f-b0b0-23c17a5aa642-operator-scripts\") pod \"barbican-db-create-gk65m\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") " pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.816957 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee26f4f-790a-479f-b0b0-23c17a5aa642-operator-scripts\") pod \"barbican-db-create-gk65m\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") " pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.828497 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4492r"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.829463 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.838407 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.838514 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.838768 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.839157 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vdqvk"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.839561 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.861006 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4492r"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.871615 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94mk8\" (UniqueName: \"kubernetes.io/projected/2ee26f4f-790a-479f-b0b0-23c17a5aa642-kube-api-access-94mk8\") pod \"barbican-db-create-gk65m\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") " pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.871789 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ncd8j"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.872794 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.898826 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ncd8j"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.916917 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblkz\" (UniqueName: \"kubernetes.io/projected/900c3adf-1008-43dd-a517-eae371754fcd-kube-api-access-fblkz\") pod \"neutron-db-create-ncd8j\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.916987 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-config-data\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.917035 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bg2b\" (UniqueName: \"kubernetes.io/projected/c863d9fd-be20-4dfd-992e-02af944c3382-kube-api-access-2bg2b\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.917059 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c3adf-1008-43dd-a517-eae371754fcd-operator-scripts\") pod \"neutron-db-create-ncd8j\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.917090 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9661e457-f424-4642-8551-c61fc2924ae9-operator-scripts\") pod \"barbican-5219-account-create-update-bp54l\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.917140 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dbq\" (UniqueName: \"kubernetes.io/projected/9661e457-f424-4642-8551-c61fc2924ae9-kube-api-access-74dbq\") pod \"barbican-5219-account-create-update-bp54l\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.917183 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-combined-ca-bundle\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.918085 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9661e457-f424-4642-8551-c61fc2924ae9-operator-scripts\") pod \"barbican-5219-account-create-update-bp54l\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.937315 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.942249 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8945-account-create-update-kldff"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.946495 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dbq\" (UniqueName: \"kubernetes.io/projected/9661e457-f424-4642-8551-c61fc2924ae9-kube-api-access-74dbq\") pod \"barbican-5219-account-create-update-bp54l\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.955204 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.960491 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8945-account-create-update-kldff"]
Mar 19 20:23:11 crc kubenswrapper[4799]: I0319 20:23:11.960556 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017648 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvrct\" (UniqueName: \"kubernetes.io/projected/3426fb78-4e09-4ddf-936d-48509c174c3f-kube-api-access-bvrct\") pod \"neutron-8945-account-create-update-kldff\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017698 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-combined-ca-bundle\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017764 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fblkz\" (UniqueName: \"kubernetes.io/projected/900c3adf-1008-43dd-a517-eae371754fcd-kube-api-access-fblkz\") pod \"neutron-db-create-ncd8j\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017784 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3426fb78-4e09-4ddf-936d-48509c174c3f-operator-scripts\") pod \"neutron-8945-account-create-update-kldff\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017803 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-config-data\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017833 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bg2b\" (UniqueName: \"kubernetes.io/projected/c863d9fd-be20-4dfd-992e-02af944c3382-kube-api-access-2bg2b\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.017854 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c3adf-1008-43dd-a517-eae371754fcd-operator-scripts\") pod \"neutron-db-create-ncd8j\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.018527 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c3adf-1008-43dd-a517-eae371754fcd-operator-scripts\") pod \"neutron-db-create-ncd8j\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.023928 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-combined-ca-bundle\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.025220 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-config-data\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.036922 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bg2b\" (UniqueName: \"kubernetes.io/projected/c863d9fd-be20-4dfd-992e-02af944c3382-kube-api-access-2bg2b\") pod \"keystone-db-sync-4492r\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.037504 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblkz\" (UniqueName: \"kubernetes.io/projected/900c3adf-1008-43dd-a517-eae371754fcd-kube-api-access-fblkz\") pod \"neutron-db-create-ncd8j\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.119485 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3426fb78-4e09-4ddf-936d-48509c174c3f-operator-scripts\") pod \"neutron-8945-account-create-update-kldff\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.119586 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvrct\" (UniqueName: \"kubernetes.io/projected/3426fb78-4e09-4ddf-936d-48509c174c3f-kube-api-access-bvrct\") pod \"neutron-8945-account-create-update-kldff\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.121276 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3426fb78-4e09-4ddf-936d-48509c174c3f-operator-scripts\") pod \"neutron-8945-account-create-update-kldff\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.137304 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvrct\" (UniqueName: \"kubernetes.io/projected/3426fb78-4e09-4ddf-936d-48509c174c3f-kube-api-access-bvrct\") pod \"neutron-8945-account-create-update-kldff\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.181534 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.195856 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4492r"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.215738 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.318824 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8945-account-create-update-kldff"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.371663 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gk65m"]
Mar 19 20:23:12 crc kubenswrapper[4799]: W0319 20:23:12.423881 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee26f4f_790a_479f_b0b0_23c17a5aa642.slice/crio-ff8c3fcf65568577b367e2b81104906877d25f96a490bac8cbb46ac043a19783 WatchSource:0}: Error finding container ff8c3fcf65568577b367e2b81104906877d25f96a490bac8cbb46ac043a19783: Status 404 returned error can't find the container with id ff8c3fcf65568577b367e2b81104906877d25f96a490bac8cbb46ac043a19783
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.453982 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h8b7k"]
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.479144 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-42a2-account-create-update-c2ghz"]
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.736885 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gk65m" event={"ID":"2ee26f4f-790a-479f-b0b0-23c17a5aa642","Type":"ContainerStarted","Data":"ff8c3fcf65568577b367e2b81104906877d25f96a490bac8cbb46ac043a19783"}
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.740328 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" event={"ID":"a2923901-dc23-464a-871f-85d66cebadb1","Type":"ContainerStarted","Data":"fc53f21c29cf93cfa43596a855e36cb873eecf948b3252d5cdbbbc2309d21d27"}
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.740742 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.746838 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8b7k" event={"ID":"b2e9a78e-8ff2-4edb-9ecc-35d304090da4","Type":"ContainerStarted","Data":"e3465fa373156a7520dca9ad3c017819e78b2acfa764a246f49981dbcb9d553b"}
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.748209 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-42a2-account-create-update-c2ghz" event={"ID":"d5ae96d4-a792-41df-85f7-3cf044fa8e0c","Type":"ContainerStarted","Data":"477e9316d9b60e3ad7c7e9828fb596715075af04447ad39f16c87d689849bec5"}
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.772431 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" podStartSLOduration=2.772412552 podStartE2EDuration="2.772412552s" podCreationTimestamp="2026-03-19 20:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:12.762932732 +0000 UTC m=+1070.368885824" watchObservedRunningTime="2026-03-19 20:23:12.772412552 +0000 UTC m=+1070.378365624"
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.959421 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5219-account-create-update-bp54l"]
Mar 19 20:23:12 crc kubenswrapper[4799]: I0319 20:23:12.976592 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ncd8j"]
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.195506 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4492r"]
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.195533 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8945-account-create-update-kldff"]
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.766182 4799 generic.go:334] "Generic (PLEG): container finished" podID="d5ae96d4-a792-41df-85f7-3cf044fa8e0c" containerID="b7b2b9aa30e71a3e51d52b973fb29cec97edd2922a6c017971c193609d030ad7" exitCode=0
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.766630 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-42a2-account-create-update-c2ghz" event={"ID":"d5ae96d4-a792-41df-85f7-3cf044fa8e0c","Type":"ContainerDied","Data":"b7b2b9aa30e71a3e51d52b973fb29cec97edd2922a6c017971c193609d030ad7"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.768714 4799 generic.go:334] "Generic (PLEG): container finished" podID="900c3adf-1008-43dd-a517-eae371754fcd" containerID="c57a7999935067c6e701e3fbce224781768447df5a40b3cd764d75de4e7f161d" exitCode=0
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.768802 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ncd8j" event={"ID":"900c3adf-1008-43dd-a517-eae371754fcd","Type":"ContainerDied","Data":"c57a7999935067c6e701e3fbce224781768447df5a40b3cd764d75de4e7f161d"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.768836 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ncd8j" event={"ID":"900c3adf-1008-43dd-a517-eae371754fcd","Type":"ContainerStarted","Data":"6ed1c9f4fc50c686081ec9f76357d0588da23641b902e47bf4438cc604d01237"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.771103 4799 generic.go:334] "Generic (PLEG): container finished" podID="3426fb78-4e09-4ddf-936d-48509c174c3f" containerID="2e4500b604fe806cfd64ffc9c29b6f158bb4074f139e879b21f76b4553692917" exitCode=0
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.771144 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8945-account-create-update-kldff" event={"ID":"3426fb78-4e09-4ddf-936d-48509c174c3f","Type":"ContainerDied","Data":"2e4500b604fe806cfd64ffc9c29b6f158bb4074f139e879b21f76b4553692917"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.771158 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8945-account-create-update-kldff" event={"ID":"3426fb78-4e09-4ddf-936d-48509c174c3f","Type":"ContainerStarted","Data":"4782fcb14b8ace9adc96a19d5dc47d33523f554dbd20f84111ba153bfbaa8332"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.773065 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ee26f4f-790a-479f-b0b0-23c17a5aa642" containerID="29b3738d832bb9ea44f3bcbcd04611e7a98e5750bc14d9f536bae35fb20335a5" exitCode=0
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.773097 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gk65m" event={"ID":"2ee26f4f-790a-479f-b0b0-23c17a5aa642","Type":"ContainerDied","Data":"29b3738d832bb9ea44f3bcbcd04611e7a98e5750bc14d9f536bae35fb20335a5"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.773857 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4492r" event={"ID":"c863d9fd-be20-4dfd-992e-02af944c3382","Type":"ContainerStarted","Data":"32ba0713842120339cbf7747cbb3db877fd3b106ca1d189f8c94a0e39eb3da36"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.774833 4799 generic.go:334] "Generic (PLEG): container finished" podID="9661e457-f424-4642-8551-c61fc2924ae9" containerID="6df65e7b4857d3d91b74ceb2058f5fc0447fb8fa7e4dd4ccc77e917c2932b070" exitCode=0
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.774861 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5219-account-create-update-bp54l" event={"ID":"9661e457-f424-4642-8551-c61fc2924ae9","Type":"ContainerDied","Data":"6df65e7b4857d3d91b74ceb2058f5fc0447fb8fa7e4dd4ccc77e917c2932b070"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.774874 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5219-account-create-update-bp54l" event={"ID":"9661e457-f424-4642-8551-c61fc2924ae9","Type":"ContainerStarted","Data":"1fc786835807f8bac42b07f9acfab3101f1161962b98559ceee5fc5ea6465499"}
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.777550 4799 generic.go:334] "Generic (PLEG): container finished" podID="b2e9a78e-8ff2-4edb-9ecc-35d304090da4" containerID="42d8edaaeccfdd85602d0f86893e604fa36f83f851a07adf8de1cc989c54d20b" exitCode=0
Mar 19 20:23:13 crc kubenswrapper[4799]: I0319 20:23:13.778333 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8b7k" event={"ID":"b2e9a78e-8ff2-4edb-9ecc-35d304090da4","Type":"ContainerDied","Data":"42d8edaaeccfdd85602d0f86893e604fa36f83f851a07adf8de1cc989c54d20b"}
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.178916 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gk65m"
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.275184 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee26f4f-790a-479f-b0b0-23c17a5aa642-operator-scripts\") pod \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") "
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.275512 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94mk8\" (UniqueName: \"kubernetes.io/projected/2ee26f4f-790a-479f-b0b0-23c17a5aa642-kube-api-access-94mk8\") pod \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\" (UID: \"2ee26f4f-790a-479f-b0b0-23c17a5aa642\") "
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.276172 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee26f4f-790a-479f-b0b0-23c17a5aa642-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ee26f4f-790a-479f-b0b0-23c17a5aa642" (UID: "2ee26f4f-790a-479f-b0b0-23c17a5aa642"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.282530 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee26f4f-790a-479f-b0b0-23c17a5aa642-kube-api-access-94mk8" (OuterVolumeSpecName: "kube-api-access-94mk8") pod "2ee26f4f-790a-479f-b0b0-23c17a5aa642" (UID: "2ee26f4f-790a-479f-b0b0-23c17a5aa642"). InnerVolumeSpecName "kube-api-access-94mk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.377843 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee26f4f-790a-479f-b0b0-23c17a5aa642-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.377876 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94mk8\" (UniqueName: \"kubernetes.io/projected/2ee26f4f-790a-479f-b0b0-23c17a5aa642-kube-api-access-94mk8\") on node \"crc\" DevicePath \"\""
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.413940 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8b7k"
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.419267 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5219-account-create-update-bp54l"
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.432696 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ncd8j"
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.437260 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-42a2-account-create-update-c2ghz"
Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.462462 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8945-account-create-update-kldff" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.579933 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74dbq\" (UniqueName: \"kubernetes.io/projected/9661e457-f424-4642-8551-c61fc2924ae9-kube-api-access-74dbq\") pod \"9661e457-f424-4642-8551-c61fc2924ae9\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580056 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3426fb78-4e09-4ddf-936d-48509c174c3f-operator-scripts\") pod \"3426fb78-4e09-4ddf-936d-48509c174c3f\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580089 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgffj\" (UniqueName: \"kubernetes.io/projected/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-kube-api-access-sgffj\") pod \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580116 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9661e457-f424-4642-8551-c61fc2924ae9-operator-scripts\") pod \"9661e457-f424-4642-8551-c61fc2924ae9\" (UID: \"9661e457-f424-4642-8551-c61fc2924ae9\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580180 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-operator-scripts\") pod \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580213 4799 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8f7kg\" (UniqueName: \"kubernetes.io/projected/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-kube-api-access-8f7kg\") pod \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\" (UID: \"b2e9a78e-8ff2-4edb-9ecc-35d304090da4\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580239 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c3adf-1008-43dd-a517-eae371754fcd-operator-scripts\") pod \"900c3adf-1008-43dd-a517-eae371754fcd\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580284 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fblkz\" (UniqueName: \"kubernetes.io/projected/900c3adf-1008-43dd-a517-eae371754fcd-kube-api-access-fblkz\") pod \"900c3adf-1008-43dd-a517-eae371754fcd\" (UID: \"900c3adf-1008-43dd-a517-eae371754fcd\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580315 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvrct\" (UniqueName: \"kubernetes.io/projected/3426fb78-4e09-4ddf-936d-48509c174c3f-kube-api-access-bvrct\") pod \"3426fb78-4e09-4ddf-936d-48509c174c3f\" (UID: \"3426fb78-4e09-4ddf-936d-48509c174c3f\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580338 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-operator-scripts\") pod \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\" (UID: \"d5ae96d4-a792-41df-85f7-3cf044fa8e0c\") " Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580554 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3426fb78-4e09-4ddf-936d-48509c174c3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"3426fb78-4e09-4ddf-936d-48509c174c3f" (UID: "3426fb78-4e09-4ddf-936d-48509c174c3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580742 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3426fb78-4e09-4ddf-936d-48509c174c3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.580970 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2e9a78e-8ff2-4edb-9ecc-35d304090da4" (UID: "b2e9a78e-8ff2-4edb-9ecc-35d304090da4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.581013 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9661e457-f424-4642-8551-c61fc2924ae9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9661e457-f424-4642-8551-c61fc2924ae9" (UID: "9661e457-f424-4642-8551-c61fc2924ae9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.581479 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5ae96d4-a792-41df-85f7-3cf044fa8e0c" (UID: "d5ae96d4-a792-41df-85f7-3cf044fa8e0c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.581496 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900c3adf-1008-43dd-a517-eae371754fcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "900c3adf-1008-43dd-a517-eae371754fcd" (UID: "900c3adf-1008-43dd-a517-eae371754fcd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.582981 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-kube-api-access-sgffj" (OuterVolumeSpecName: "kube-api-access-sgffj") pod "d5ae96d4-a792-41df-85f7-3cf044fa8e0c" (UID: "d5ae96d4-a792-41df-85f7-3cf044fa8e0c"). InnerVolumeSpecName "kube-api-access-sgffj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.584585 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3426fb78-4e09-4ddf-936d-48509c174c3f-kube-api-access-bvrct" (OuterVolumeSpecName: "kube-api-access-bvrct") pod "3426fb78-4e09-4ddf-936d-48509c174c3f" (UID: "3426fb78-4e09-4ddf-936d-48509c174c3f"). InnerVolumeSpecName "kube-api-access-bvrct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.584624 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9661e457-f424-4642-8551-c61fc2924ae9-kube-api-access-74dbq" (OuterVolumeSpecName: "kube-api-access-74dbq") pod "9661e457-f424-4642-8551-c61fc2924ae9" (UID: "9661e457-f424-4642-8551-c61fc2924ae9"). InnerVolumeSpecName "kube-api-access-74dbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.585182 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900c3adf-1008-43dd-a517-eae371754fcd-kube-api-access-fblkz" (OuterVolumeSpecName: "kube-api-access-fblkz") pod "900c3adf-1008-43dd-a517-eae371754fcd" (UID: "900c3adf-1008-43dd-a517-eae371754fcd"). InnerVolumeSpecName "kube-api-access-fblkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.585872 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-kube-api-access-8f7kg" (OuterVolumeSpecName: "kube-api-access-8f7kg") pod "b2e9a78e-8ff2-4edb-9ecc-35d304090da4" (UID: "b2e9a78e-8ff2-4edb-9ecc-35d304090da4"). InnerVolumeSpecName "kube-api-access-8f7kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682473 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74dbq\" (UniqueName: \"kubernetes.io/projected/9661e457-f424-4642-8551-c61fc2924ae9-kube-api-access-74dbq\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682511 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgffj\" (UniqueName: \"kubernetes.io/projected/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-kube-api-access-sgffj\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682523 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9661e457-f424-4642-8551-c61fc2924ae9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682532 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682541 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f7kg\" (UniqueName: \"kubernetes.io/projected/b2e9a78e-8ff2-4edb-9ecc-35d304090da4-kube-api-access-8f7kg\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682552 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c3adf-1008-43dd-a517-eae371754fcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682560 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fblkz\" (UniqueName: \"kubernetes.io/projected/900c3adf-1008-43dd-a517-eae371754fcd-kube-api-access-fblkz\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682569 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvrct\" (UniqueName: \"kubernetes.io/projected/3426fb78-4e09-4ddf-936d-48509c174c3f-kube-api-access-bvrct\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.682578 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ae96d4-a792-41df-85f7-3cf044fa8e0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.800345 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5219-account-create-update-bp54l" event={"ID":"9661e457-f424-4642-8551-c61fc2924ae9","Type":"ContainerDied","Data":"1fc786835807f8bac42b07f9acfab3101f1161962b98559ceee5fc5ea6465499"} Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.800735 4799 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1fc786835807f8bac42b07f9acfab3101f1161962b98559ceee5fc5ea6465499" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.800489 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5219-account-create-update-bp54l" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.804630 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8b7k" event={"ID":"b2e9a78e-8ff2-4edb-9ecc-35d304090da4","Type":"ContainerDied","Data":"e3465fa373156a7520dca9ad3c017819e78b2acfa764a246f49981dbcb9d553b"} Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.804683 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8b7k" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.804703 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3465fa373156a7520dca9ad3c017819e78b2acfa764a246f49981dbcb9d553b" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.806440 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-42a2-account-create-update-c2ghz" event={"ID":"d5ae96d4-a792-41df-85f7-3cf044fa8e0c","Type":"ContainerDied","Data":"477e9316d9b60e3ad7c7e9828fb596715075af04447ad39f16c87d689849bec5"} Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.806497 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="477e9316d9b60e3ad7c7e9828fb596715075af04447ad39f16c87d689849bec5" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.806600 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-42a2-account-create-update-c2ghz" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.810134 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ncd8j" event={"ID":"900c3adf-1008-43dd-a517-eae371754fcd","Type":"ContainerDied","Data":"6ed1c9f4fc50c686081ec9f76357d0588da23641b902e47bf4438cc604d01237"} Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.810172 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ed1c9f4fc50c686081ec9f76357d0588da23641b902e47bf4438cc604d01237" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.810227 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ncd8j" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.815000 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8945-account-create-update-kldff" event={"ID":"3426fb78-4e09-4ddf-936d-48509c174c3f","Type":"ContainerDied","Data":"4782fcb14b8ace9adc96a19d5dc47d33523f554dbd20f84111ba153bfbaa8332"} Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.815026 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4782fcb14b8ace9adc96a19d5dc47d33523f554dbd20f84111ba153bfbaa8332" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.815026 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8945-account-create-update-kldff" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.817068 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gk65m" event={"ID":"2ee26f4f-790a-479f-b0b0-23c17a5aa642","Type":"ContainerDied","Data":"ff8c3fcf65568577b367e2b81104906877d25f96a490bac8cbb46ac043a19783"} Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.817100 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff8c3fcf65568577b367e2b81104906877d25f96a490bac8cbb46ac043a19783" Mar 19 20:23:15 crc kubenswrapper[4799]: I0319 20:23:15.817110 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gk65m" Mar 19 20:23:19 crc kubenswrapper[4799]: I0319 20:23:19.865970 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4492r" event={"ID":"c863d9fd-be20-4dfd-992e-02af944c3382","Type":"ContainerStarted","Data":"1ee26ff8168a50a5916e7c030c24c6f46cd670d9cd06949e93a8ccec96ddd5da"} Mar 19 20:23:20 crc kubenswrapper[4799]: I0319 20:23:20.451631 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:20 crc kubenswrapper[4799]: I0319 20:23:20.483714 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4492r" podStartSLOduration=4.012545729 podStartE2EDuration="9.483688647s" podCreationTimestamp="2026-03-19 20:23:11 +0000 UTC" firstStartedPulling="2026-03-19 20:23:13.190158294 +0000 UTC m=+1070.796111366" lastFinishedPulling="2026-03-19 20:23:18.661301222 +0000 UTC m=+1076.267254284" observedRunningTime="2026-03-19 20:23:19.889138784 +0000 UTC m=+1077.495091856" watchObservedRunningTime="2026-03-19 20:23:20.483688647 +0000 UTC m=+1078.089641729" Mar 19 20:23:20 crc kubenswrapper[4799]: I0319 20:23:20.542931 4799 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-w5tpt"] Mar 19 20:23:20 crc kubenswrapper[4799]: I0319 20:23:20.543227 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="dnsmasq-dns" containerID="cri-o://732739c5eb90e7745fe42dbb831e94711711466b8cd8ab562b9e153e33dd7149" gracePeriod=10 Mar 19 20:23:20 crc kubenswrapper[4799]: I0319 20:23:20.875296 4799 generic.go:334] "Generic (PLEG): container finished" podID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerID="732739c5eb90e7745fe42dbb831e94711711466b8cd8ab562b9e153e33dd7149" exitCode=0 Mar 19 20:23:20 crc kubenswrapper[4799]: I0319 20:23:20.875401 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" event={"ID":"091c8182-6d14-4d5b-be74-afa07b3d201f","Type":"ContainerDied","Data":"732739c5eb90e7745fe42dbb831e94711711466b8cd8ab562b9e153e33dd7149"} Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.106200 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.698286 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.794751 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-swift-storage-0\") pod \"091c8182-6d14-4d5b-be74-afa07b3d201f\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.795133 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-nb\") pod \"091c8182-6d14-4d5b-be74-afa07b3d201f\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.795153 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db4qh\" (UniqueName: \"kubernetes.io/projected/091c8182-6d14-4d5b-be74-afa07b3d201f-kube-api-access-db4qh\") pod \"091c8182-6d14-4d5b-be74-afa07b3d201f\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.795235 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-config\") pod \"091c8182-6d14-4d5b-be74-afa07b3d201f\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.795335 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-svc\") pod \"091c8182-6d14-4d5b-be74-afa07b3d201f\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.795364 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-sb\") pod \"091c8182-6d14-4d5b-be74-afa07b3d201f\" (UID: \"091c8182-6d14-4d5b-be74-afa07b3d201f\") " Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.826637 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091c8182-6d14-4d5b-be74-afa07b3d201f-kube-api-access-db4qh" (OuterVolumeSpecName: "kube-api-access-db4qh") pod "091c8182-6d14-4d5b-be74-afa07b3d201f" (UID: "091c8182-6d14-4d5b-be74-afa07b3d201f"). InnerVolumeSpecName "kube-api-access-db4qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.868461 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "091c8182-6d14-4d5b-be74-afa07b3d201f" (UID: "091c8182-6d14-4d5b-be74-afa07b3d201f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.868503 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "091c8182-6d14-4d5b-be74-afa07b3d201f" (UID: "091c8182-6d14-4d5b-be74-afa07b3d201f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.881531 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-config" (OuterVolumeSpecName: "config") pod "091c8182-6d14-4d5b-be74-afa07b3d201f" (UID: "091c8182-6d14-4d5b-be74-afa07b3d201f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.885325 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "091c8182-6d14-4d5b-be74-afa07b3d201f" (UID: "091c8182-6d14-4d5b-be74-afa07b3d201f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.891843 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" event={"ID":"091c8182-6d14-4d5b-be74-afa07b3d201f","Type":"ContainerDied","Data":"2ad81c4cb557ef3020c70ca51ffe86d573f5c55a67c455c995696642ff6a9184"} Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.891905 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-w5tpt" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.891941 4799 scope.go:117] "RemoveContainer" containerID="732739c5eb90e7745fe42dbb831e94711711466b8cd8ab562b9e153e33dd7149" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.892207 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "091c8182-6d14-4d5b-be74-afa07b3d201f" (UID: "091c8182-6d14-4d5b-be74-afa07b3d201f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.897713 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.897734 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.897744 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.897752 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db4qh\" (UniqueName: \"kubernetes.io/projected/091c8182-6d14-4d5b-be74-afa07b3d201f-kube-api-access-db4qh\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.897763 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.897772 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/091c8182-6d14-4d5b-be74-afa07b3d201f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:21 crc kubenswrapper[4799]: I0319 20:23:21.948939 4799 scope.go:117] "RemoveContainer" containerID="3538bcede0a47f2d33ec71e8a9e32481c8848678421be9c85626d924f1bc3950" Mar 19 20:23:22 crc kubenswrapper[4799]: I0319 20:23:22.229617 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-86cbdd8bfc-w5tpt"] Mar 19 20:23:22 crc kubenswrapper[4799]: I0319 20:23:22.237940 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-w5tpt"] Mar 19 20:23:22 crc kubenswrapper[4799]: I0319 20:23:22.904880 4799 generic.go:334] "Generic (PLEG): container finished" podID="c863d9fd-be20-4dfd-992e-02af944c3382" containerID="1ee26ff8168a50a5916e7c030c24c6f46cd670d9cd06949e93a8ccec96ddd5da" exitCode=0 Mar 19 20:23:22 crc kubenswrapper[4799]: I0319 20:23:22.905030 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4492r" event={"ID":"c863d9fd-be20-4dfd-992e-02af944c3382","Type":"ContainerDied","Data":"1ee26ff8168a50a5916e7c030c24c6f46cd670d9cd06949e93a8ccec96ddd5da"} Mar 19 20:23:23 crc kubenswrapper[4799]: I0319 20:23:23.134043 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" path="/var/lib/kubelet/pods/091c8182-6d14-4d5b-be74-afa07b3d201f/volumes" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.281584 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4492r" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.448599 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bg2b\" (UniqueName: \"kubernetes.io/projected/c863d9fd-be20-4dfd-992e-02af944c3382-kube-api-access-2bg2b\") pod \"c863d9fd-be20-4dfd-992e-02af944c3382\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.448730 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-combined-ca-bundle\") pod \"c863d9fd-be20-4dfd-992e-02af944c3382\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.448789 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-config-data\") pod \"c863d9fd-be20-4dfd-992e-02af944c3382\" (UID: \"c863d9fd-be20-4dfd-992e-02af944c3382\") " Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.455205 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c863d9fd-be20-4dfd-992e-02af944c3382-kube-api-access-2bg2b" (OuterVolumeSpecName: "kube-api-access-2bg2b") pod "c863d9fd-be20-4dfd-992e-02af944c3382" (UID: "c863d9fd-be20-4dfd-992e-02af944c3382"). InnerVolumeSpecName "kube-api-access-2bg2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.487621 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c863d9fd-be20-4dfd-992e-02af944c3382" (UID: "c863d9fd-be20-4dfd-992e-02af944c3382"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.496712 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-config-data" (OuterVolumeSpecName: "config-data") pod "c863d9fd-be20-4dfd-992e-02af944c3382" (UID: "c863d9fd-be20-4dfd-992e-02af944c3382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.551512 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.551779 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c863d9fd-be20-4dfd-992e-02af944c3382-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.551858 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bg2b\" (UniqueName: \"kubernetes.io/projected/c863d9fd-be20-4dfd-992e-02af944c3382-kube-api-access-2bg2b\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.928677 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4492r" event={"ID":"c863d9fd-be20-4dfd-992e-02af944c3382","Type":"ContainerDied","Data":"32ba0713842120339cbf7747cbb3db877fd3b106ca1d189f8c94a0e39eb3da36"} Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.928736 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ba0713842120339cbf7747cbb3db877fd3b106ca1d189f8c94a0e39eb3da36" Mar 19 20:23:24 crc kubenswrapper[4799]: I0319 20:23:24.929229 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4492r" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.206683 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q74bx"] Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207017 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="init" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207032 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="init" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207048 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3426fb78-4e09-4ddf-936d-48509c174c3f" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207054 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3426fb78-4e09-4ddf-936d-48509c174c3f" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207062 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e9a78e-8ff2-4edb-9ecc-35d304090da4" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207067 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e9a78e-8ff2-4edb-9ecc-35d304090da4" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207079 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="dnsmasq-dns" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207084 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="dnsmasq-dns" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207097 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee26f4f-790a-479f-b0b0-23c17a5aa642" 
containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207104 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee26f4f-790a-479f-b0b0-23c17a5aa642" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207112 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c863d9fd-be20-4dfd-992e-02af944c3382" containerName="keystone-db-sync" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207119 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c863d9fd-be20-4dfd-992e-02af944c3382" containerName="keystone-db-sync" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207128 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9661e457-f424-4642-8551-c61fc2924ae9" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207134 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9661e457-f424-4642-8551-c61fc2924ae9" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207142 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ae96d4-a792-41df-85f7-3cf044fa8e0c" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207148 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ae96d4-a792-41df-85f7-3cf044fa8e0c" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: E0319 20:23:25.207162 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900c3adf-1008-43dd-a517-eae371754fcd" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207168 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="900c3adf-1008-43dd-a517-eae371754fcd" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207307 4799 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="091c8182-6d14-4d5b-be74-afa07b3d201f" containerName="dnsmasq-dns" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207320 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e9a78e-8ff2-4edb-9ecc-35d304090da4" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207327 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="900c3adf-1008-43dd-a517-eae371754fcd" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207334 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ae96d4-a792-41df-85f7-3cf044fa8e0c" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207350 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3426fb78-4e09-4ddf-936d-48509c174c3f" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207361 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c863d9fd-be20-4dfd-992e-02af944c3382" containerName="keystone-db-sync" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207367 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee26f4f-790a-479f-b0b0-23c17a5aa642" containerName="mariadb-database-create" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.207375 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9661e457-f424-4642-8551-c61fc2924ae9" containerName="mariadb-account-create-update" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.209355 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.217852 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.217969 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vdqvk" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.218052 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.218299 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.218454 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.238072 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q74bx"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.269628 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c7999457-lbvhl"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.278425 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.370950 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-nb\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371004 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-svc\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371027 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-credential-keys\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371064 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvkwz\" (UniqueName: \"kubernetes.io/projected/ea453291-0fc9-4f73-8d5a-762b02101d81-kube-api-access-pvkwz\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371085 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-fernet-keys\") pod \"keystone-bootstrap-q74bx\" (UID: 
\"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371122 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-config\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371142 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpx4d\" (UniqueName: \"kubernetes.io/projected/d77894fe-c799-4924-9ef1-563270ec8035-kube-api-access-lpx4d\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371212 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-combined-ca-bundle\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371399 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-sb\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371428 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-swift-storage-0\") 
pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371450 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-config-data\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.371480 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-scripts\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.385869 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-lbvhl"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472364 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-combined-ca-bundle\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472479 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-sb\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472502 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-swift-storage-0\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472522 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-config-data\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472552 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-scripts\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472572 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-nb\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472589 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-svc\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472603 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-credential-keys\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472629 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvkwz\" (UniqueName: \"kubernetes.io/projected/ea453291-0fc9-4f73-8d5a-762b02101d81-kube-api-access-pvkwz\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472643 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-fernet-keys\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472676 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-config\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.472690 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpx4d\" (UniqueName: \"kubernetes.io/projected/d77894fe-c799-4924-9ef1-563270ec8035-kube-api-access-lpx4d\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.481561 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-combined-ca-bundle\") 
pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.482225 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-sb\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.482742 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-swift-storage-0\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.486076 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-config\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.486613 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-nb\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.486976 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-config-data\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " 
pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.488423 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-svc\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.491898 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-scripts\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.504676 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-credential-keys\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.512946 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-fernet-keys\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.522706 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5679479bf7-5r6zg"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.523950 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.528250 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.534096 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpx4d\" (UniqueName: \"kubernetes.io/projected/d77894fe-c799-4924-9ef1-563270ec8035-kube-api-access-lpx4d\") pod \"dnsmasq-dns-84c7999457-lbvhl\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.550908 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvkwz\" (UniqueName: \"kubernetes.io/projected/ea453291-0fc9-4f73-8d5a-762b02101d81-kube-api-access-pvkwz\") pod \"keystone-bootstrap-q74bx\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.551647 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-2b2hx" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.551909 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.552179 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.569554 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5679479bf7-5r6zg"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.582278 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-config-data\") pod \"horizon-5679479bf7-5r6zg\" (UID: 
\"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.582338 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstgm\" (UniqueName: \"kubernetes.io/projected/6b887e22-48a3-4d49-96d7-5788d8e69ef2-kube-api-access-jstgm\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.582373 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-scripts\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.582443 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b887e22-48a3-4d49-96d7-5788d8e69ef2-logs\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.582461 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b887e22-48a3-4d49-96d7-5788d8e69ef2-horizon-secret-key\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.635697 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jnnwh"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.636933 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jnnwh" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.640116 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.645105 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.667837 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jnnwh"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.681217 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rtfrr" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.686345 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-config-data\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.686412 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstgm\" (UniqueName: \"kubernetes.io/projected/6b887e22-48a3-4d49-96d7-5788d8e69ef2-kube-api-access-jstgm\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.686448 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-scripts\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.686495 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b887e22-48a3-4d49-96d7-5788d8e69ef2-logs\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.686511 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b887e22-48a3-4d49-96d7-5788d8e69ef2-horizon-secret-key\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.688044 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-config-data\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.688630 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-scripts\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.688810 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b887e22-48a3-4d49-96d7-5788d8e69ef2-logs\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.690139 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.720732 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b887e22-48a3-4d49-96d7-5788d8e69ef2-horizon-secret-key\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.766672 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mdm7f"] Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.767796 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mdm7f" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.771779 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6ln6j" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.771954 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.772074 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.772630 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstgm\" (UniqueName: \"kubernetes.io/projected/6b887e22-48a3-4d49-96d7-5788d8e69ef2-kube-api-access-jstgm\") pod \"horizon-5679479bf7-5r6zg\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.788154 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-combined-ca-bundle\") pod \"cinder-db-sync-jnnwh\" (UID: 
\"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.788205 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-scripts\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.788248 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-db-sync-config-data\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.788277 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51a34e40-cb35-4589-9fcd-20130bd7831f-etc-machine-id\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.788332 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-config-data\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.788365 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56wz\" (UniqueName: \"kubernetes.io/projected/51a34e40-cb35-4589-9fcd-20130bd7831f-kube-api-access-q56wz\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.804974 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mdm7f"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.825184 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q74bx"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.828193 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.831186 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.849931 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.873801 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.882698 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.893436 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-566c4bf65-cxp7g"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.894711 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.895910 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-combined-ca-bundle\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.895934 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-scripts\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.895957 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-scripts\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.895979 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-combined-ca-bundle\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.895996 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896010 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-config-data\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896039 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-db-sync-config-data\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896064 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51a34e40-cb35-4589-9fcd-20130bd7831f-etc-machine-id\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896270 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-config\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896294 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2q48\" (UniqueName: \"kubernetes.io/projected/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-kube-api-access-g2q48\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896315 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-run-httpd\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896341 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-config-data\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896362 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-log-httpd\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896397 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56wz\" (UniqueName: \"kubernetes.io/projected/51a34e40-cb35-4589-9fcd-20130bd7831f-kube-api-access-q56wz\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896412 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklsf\" (UniqueName: \"kubernetes.io/projected/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-kube-api-access-qklsf\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.896429 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.899744 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51a34e40-cb35-4589-9fcd-20130bd7831f-etc-machine-id\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.899957 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.901677 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.902644 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-scripts\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.908725 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-config-data\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.910054 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-combined-ca-bundle\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.912024 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5fzj4"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.912333 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.912471 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.914559 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.918426 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-db-sync-config-data\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.920770 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5679479bf7-5r6zg"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.930323 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-566c4bf65-cxp7g"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.947162 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56wz\" (UniqueName: \"kubernetes.io/projected/51a34e40-cb35-4589-9fcd-20130bd7831f-kube-api-access-q56wz\") pod \"cinder-db-sync-jnnwh\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.954874 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.971200 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jnnwh"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.974360 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-s6gmj"]
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.976918 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s6gmj"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.980567 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2875p"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.981206 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 19 20:23:25 crc kubenswrapper[4799]: I0319 20:23:25.984354 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000141 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68vd\" (UniqueName: \"kubernetes.io/projected/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-kube-api-access-z68vd\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000189 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-logs\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000214 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-config\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000243 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2q48\" (UniqueName: \"kubernetes.io/projected/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-kube-api-access-g2q48\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000273 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-run-httpd\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000308 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000337 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-log-httpd\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000359 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-config-data\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000377 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklsf\" (UniqueName: \"kubernetes.io/projected/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-kube-api-access-qklsf\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000409 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000429 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000454 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000474 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-scripts\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000496 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxfdl\" (UniqueName: \"kubernetes.io/projected/2bac784a-8455-4061-a876-94e8b394a40d-kube-api-access-mxfdl\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000518 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000542 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-scripts\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000558 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-horizon-secret-key\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000575 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000604 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-combined-ca-bundle\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000622 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000640 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-config-data\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000661 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-logs\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.000693 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.007995 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-run-httpd\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.011166 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-log-httpd\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.012011 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s6gmj"]
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.030451 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-lbvhl"]
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.032684 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-combined-ca-bundle\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.033347 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-config-data\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.038655 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.038970 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-config\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.039531 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.041882 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-scripts\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.053865 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2q48\" (UniqueName: \"kubernetes.io/projected/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-kube-api-access-g2q48\") pod \"neutron-db-sync-mdm7f\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.055093 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"]
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.056309 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.077833 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklsf\" (UniqueName: \"kubernetes.io/projected/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-kube-api-access-qklsf\") pod \"ceilometer-0\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " pod="openstack/ceilometer-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.085748 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"]
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.095886 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rcxnv"]
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.096896 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rcxnv"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.098469 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mdm7f"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.108273 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109004 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-logs\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109037 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-combined-ca-bundle\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109069 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-scripts\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109101 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-config\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109133 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109165 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxgvf\" (UniqueName: \"kubernetes.io/projected/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-kube-api-access-sxgvf\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109210 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-config-data\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109241 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-svc\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109259 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109294 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109309 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-scripts\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109329 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxfdl\" (UniqueName: \"kubernetes.io/projected/2bac784a-8455-4061-a876-94e8b394a40d-kube-api-access-mxfdl\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109351 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-config-data\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109369 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109405 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109443 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-horizon-secret-key\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109463 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109533 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-logs\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109551 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkq7c\" (UniqueName: \"kubernetes.io/projected/d181492c-247b-49bc-8e40-7a531d76011d-kube-api-access-lkq7c\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109568 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.109586 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.110143 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-scripts\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.111183 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-config-data\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.112283 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.112672 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-horizon-secret-key\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.113117 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.113164 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z68vd\" (UniqueName: \"kubernetes.io/projected/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-kube-api-access-z68vd\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.113210 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-logs\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.113528 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-logs\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g"
Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.113754 4799
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.124480 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.124691 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k2v25" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.124701 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-logs\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.125146 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.127834 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.130369 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.145823 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rcxnv"] Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.147294 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.148177 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.161116 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.162026 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68vd\" (UniqueName: \"kubernetes.io/projected/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-kube-api-access-z68vd\") pod \"horizon-566c4bf65-cxp7g\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " pod="openstack/horizon-566c4bf65-cxp7g" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.165467 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxfdl\" (UniqueName: \"kubernetes.io/projected/2bac784a-8455-4061-a876-94e8b394a40d-kube-api-access-mxfdl\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.171844 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.177208 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.179251 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215508 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215682 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-scripts\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215711 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-config\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215739 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxgvf\" (UniqueName: \"kubernetes.io/projected/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-kube-api-access-sxgvf\") pod 
\"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215773 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqkhg\" (UniqueName: \"kubernetes.io/projected/44606ca0-c233-4390-95af-93a2f6ad55ea-kube-api-access-dqkhg\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215811 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-svc\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215828 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215858 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-config-data\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215878 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215898 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-combined-ca-bundle\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215944 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.215990 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkq7c\" (UniqueName: \"kubernetes.io/projected/d181492c-247b-49bc-8e40-7a531d76011d-kube-api-access-lkq7c\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216023 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216041 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktcs4\" (UniqueName: \"kubernetes.io/projected/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-kube-api-access-ktcs4\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216073 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216106 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216129 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216149 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216166 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-logs\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216185 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-combined-ca-bundle\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216204 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-db-sync-config-data\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.216792 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.217973 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-config\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: 
\"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.219852 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-svc\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.221471 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.221978 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.222721 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.223241 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-logs\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc 
kubenswrapper[4799]: I0319 20:23:26.227647 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c4bf65-cxp7g" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.229325 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-config-data\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.234717 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-scripts\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.248530 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxgvf\" (UniqueName: \"kubernetes.io/projected/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-kube-api-access-sxgvf\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.251655 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkq7c\" (UniqueName: \"kubernetes.io/projected/d181492c-247b-49bc-8e40-7a531d76011d-kube-api-access-lkq7c\") pod \"dnsmasq-dns-5d5dc7cf69-h4sjb\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.258753 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-combined-ca-bundle\") pod \"placement-db-sync-s6gmj\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " 
pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.265519 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.311986 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s6gmj" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktcs4\" (UniqueName: \"kubernetes.io/projected/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-kube-api-access-ktcs4\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317044 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317086 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317108 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317130 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317152 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-db-sync-config-data\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317171 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317205 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqkhg\" (UniqueName: \"kubernetes.io/projected/44606ca0-c233-4390-95af-93a2f6ad55ea-kube-api-access-dqkhg\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317231 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317262 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-combined-ca-bundle\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317292 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317591 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.317661 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.318303 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.322922 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.326858 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.330113 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.336801 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-db-sync-config-data\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.337128 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.341742 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-combined-ca-bundle\") pod 
\"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.373907 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktcs4\" (UniqueName: \"kubernetes.io/projected/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-kube-api-access-ktcs4\") pod \"barbican-db-sync-rcxnv\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.379807 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.380883 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqkhg\" (UniqueName: \"kubernetes.io/projected/44606ca0-c233-4390-95af-93a2f6ad55ea-kube-api-access-dqkhg\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.413046 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.417811 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.447887 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.465805 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-lbvhl"] Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.699754 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5679479bf7-5r6zg"] Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.715909 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q74bx"] Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.838978 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:23:26 crc kubenswrapper[4799]: W0319 20:23:26.847804 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e306dff_f3ce_48df_b9d4_aef952f3e0a5.slice/crio-233d9264946f1281ac764a6bb708a88426e707a71c39f707e965f381c77eee1b WatchSource:0}: Error finding container 233d9264946f1281ac764a6bb708a88426e707a71c39f707e965f381c77eee1b: Status 404 returned error can't find the container with id 233d9264946f1281ac764a6bb708a88426e707a71c39f707e965f381c77eee1b Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.965116 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q74bx" event={"ID":"ea453291-0fc9-4f73-8d5a-762b02101d81","Type":"ContainerStarted","Data":"46e9a2e2b408783696a5ac06152210d1cd5ac8edd8f6a8b42c724202356e5a2f"} Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.965173 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q74bx" event={"ID":"ea453291-0fc9-4f73-8d5a-762b02101d81","Type":"ContainerStarted","Data":"a279e379228fb5b893b9a2158ce1a82a02ae7c56da11c9aa81f30c0a053749cc"} Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.975454 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-5679479bf7-5r6zg" event={"ID":"6b887e22-48a3-4d49-96d7-5788d8e69ef2","Type":"ContainerStarted","Data":"20b371064a8f6d7874aa5d1023fd3c3f1f9249bec68cfe811c6461b88d0ee00a"} Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.980565 4799 generic.go:334] "Generic (PLEG): container finished" podID="d77894fe-c799-4924-9ef1-563270ec8035" containerID="99ff092341fd74402c26074d10e62465418947038232bb0ec487c83118ec3d8d" exitCode=0 Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.980613 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" event={"ID":"d77894fe-c799-4924-9ef1-563270ec8035","Type":"ContainerDied","Data":"99ff092341fd74402c26074d10e62465418947038232bb0ec487c83118ec3d8d"} Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.980642 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" event={"ID":"d77894fe-c799-4924-9ef1-563270ec8035","Type":"ContainerStarted","Data":"5945c127051bfee0c67f3d5d4b039afc614c8bab5e90ec77e667870bfd062d09"} Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.984110 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerStarted","Data":"233d9264946f1281ac764a6bb708a88426e707a71c39f707e965f381c77eee1b"} Mar 19 20:23:26 crc kubenswrapper[4799]: I0319 20:23:26.985939 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q74bx" podStartSLOduration=1.985928698 podStartE2EDuration="1.985928698s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:26.985514307 +0000 UTC m=+1084.591467399" watchObservedRunningTime="2026-03-19 20:23:26.985928698 +0000 UTC m=+1084.591881770" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 
20:23:27.265686 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mdm7f"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.292505 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jnnwh"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.317644 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-566c4bf65-cxp7g"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.325968 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s6gmj"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.466205 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.551658 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.562955 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-nb\") pod \"d77894fe-c799-4924-9ef1-563270ec8035\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.562993 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-svc\") pod \"d77894fe-c799-4924-9ef1-563270ec8035\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.563043 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-swift-storage-0\") pod \"d77894fe-c799-4924-9ef1-563270ec8035\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " Mar 
19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.563084 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpx4d\" (UniqueName: \"kubernetes.io/projected/d77894fe-c799-4924-9ef1-563270ec8035-kube-api-access-lpx4d\") pod \"d77894fe-c799-4924-9ef1-563270ec8035\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.563161 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-sb\") pod \"d77894fe-c799-4924-9ef1-563270ec8035\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.563251 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-config\") pod \"d77894fe-c799-4924-9ef1-563270ec8035\" (UID: \"d77894fe-c799-4924-9ef1-563270ec8035\") " Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.571962 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77894fe-c799-4924-9ef1-563270ec8035-kube-api-access-lpx4d" (OuterVolumeSpecName: "kube-api-access-lpx4d") pod "d77894fe-c799-4924-9ef1-563270ec8035" (UID: "d77894fe-c799-4924-9ef1-563270ec8035"). InnerVolumeSpecName "kube-api-access-lpx4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.576011 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rcxnv"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.620136 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d77894fe-c799-4924-9ef1-563270ec8035" (UID: "d77894fe-c799-4924-9ef1-563270ec8035"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.653890 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-config" (OuterVolumeSpecName: "config") pod "d77894fe-c799-4924-9ef1-563270ec8035" (UID: "d77894fe-c799-4924-9ef1-563270ec8035"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.667668 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.667697 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpx4d\" (UniqueName: \"kubernetes.io/projected/d77894fe-c799-4924-9ef1-563270ec8035-kube-api-access-lpx4d\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.667712 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.676954 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d77894fe-c799-4924-9ef1-563270ec8035" (UID: "d77894fe-c799-4924-9ef1-563270ec8035"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.748395 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d77894fe-c799-4924-9ef1-563270ec8035" (UID: "d77894fe-c799-4924-9ef1-563270ec8035"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.748865 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d77894fe-c799-4924-9ef1-563270ec8035" (UID: "d77894fe-c799-4924-9ef1-563270ec8035"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.774428 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.774474 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.774486 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d77894fe-c799-4924-9ef1-563270ec8035-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.791864 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:27 crc kubenswrapper[4799]: I0319 20:23:27.929774 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.091610 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-566c4bf65-cxp7g"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.132199 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 
20:23:28.137790 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rcxnv" event={"ID":"a40c3083-66e1-4fb7-b4b4-88fa2935cb49","Type":"ContainerStarted","Data":"c5f98c6a80037c286f553835f815e10a625b7e3257b03c9cf63919da36c49791"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.165807 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-665b847dc-rfzv9"] Mar 19 20:23:28 crc kubenswrapper[4799]: E0319 20:23:28.166215 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77894fe-c799-4924-9ef1-563270ec8035" containerName="init" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.166227 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77894fe-c799-4924-9ef1-563270ec8035" containerName="init" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.166374 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77894fe-c799-4924-9ef1-563270ec8035" containerName="init" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.167403 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.170058 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665b847dc-rfzv9"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.178423 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.185562 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" event={"ID":"d181492c-247b-49bc-8e40-7a531d76011d","Type":"ContainerStarted","Data":"57ebba1e06020a68b7d77c79992e2d29469f4575089d8e59fd4a054d7e2aa213"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.204850 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" event={"ID":"d77894fe-c799-4924-9ef1-563270ec8035","Type":"ContainerDied","Data":"5945c127051bfee0c67f3d5d4b039afc614c8bab5e90ec77e667870bfd062d09"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.204898 4799 scope.go:117] "RemoveContainer" containerID="99ff092341fd74402c26074d10e62465418947038232bb0ec487c83118ec3d8d" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.205006 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-lbvhl" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.215399 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mdm7f" event={"ID":"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6","Type":"ContainerStarted","Data":"d9a73fe93c5da4a1b0141d12844c0ea46ffb7817596288a956a936e5b578eb9f"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.215446 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mdm7f" event={"ID":"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6","Type":"ContainerStarted","Data":"9ecceff605a0c72b41d027a6d3944306f4190f0cb06fa1f5e2a54d8b93ebbb5a"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.223458 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnnwh" event={"ID":"51a34e40-cb35-4589-9fcd-20130bd7831f","Type":"ContainerStarted","Data":"4fcb078cc059a6cb1ef0587c40c790ba66d4a6b60db195e4f46279f722c8da27"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.236201 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mdm7f" podStartSLOduration=3.236185785 podStartE2EDuration="3.236185785s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:28.234670823 +0000 UTC m=+1085.840623895" watchObservedRunningTime="2026-03-19 20:23:28.236185785 +0000 UTC m=+1085.842138857" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.258524 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c4bf65-cxp7g" event={"ID":"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4","Type":"ContainerStarted","Data":"d205baf5c30c80723fad860d7a18e7fb524edbb31a6a7f8df6c09d3e74a4a351"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.263915 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-s6gmj" event={"ID":"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd","Type":"ContainerStarted","Data":"878ae09b8a2a3df9edd6e2f5c3c4ef254ad58d40ef4f19a190d70f30580aea68"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.269762 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44606ca0-c233-4390-95af-93a2f6ad55ea","Type":"ContainerStarted","Data":"37c8d616661a680fa74263a68250f3dc04b446337c2d5cb7d5f035dd626da74e"} Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.299609 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-config-data\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.299686 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-scripts\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.299772 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8cb96f-24fe-46e8-a18f-2175ed50b782-logs\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.299935 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8cb96f-24fe-46e8-a18f-2175ed50b782-horizon-secret-key\") pod \"horizon-665b847dc-rfzv9\" (UID: 
\"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.299984 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729xh\" (UniqueName: \"kubernetes.io/projected/9c8cb96f-24fe-46e8-a18f-2175ed50b782-kube-api-access-729xh\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.361531 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-lbvhl"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.370059 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-lbvhl"] Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.403494 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8cb96f-24fe-46e8-a18f-2175ed50b782-logs\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.403636 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8cb96f-24fe-46e8-a18f-2175ed50b782-horizon-secret-key\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.403662 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729xh\" (UniqueName: \"kubernetes.io/projected/9c8cb96f-24fe-46e8-a18f-2175ed50b782-kube-api-access-729xh\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc 
kubenswrapper[4799]: I0319 20:23:28.403684 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-config-data\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.403742 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-scripts\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.404117 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8cb96f-24fe-46e8-a18f-2175ed50b782-logs\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.405064 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-scripts\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.405257 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-config-data\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.414288 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/9c8cb96f-24fe-46e8-a18f-2175ed50b782-horizon-secret-key\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.420037 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729xh\" (UniqueName: \"kubernetes.io/projected/9c8cb96f-24fe-46e8-a18f-2175ed50b782-kube-api-access-729xh\") pod \"horizon-665b847dc-rfzv9\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.532293 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:28 crc kubenswrapper[4799]: W0319 20:23:28.538903 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bac784a_8455_4061_a876_94e8b394a40d.slice/crio-7eec7926a14ba6c80aa4d3e55cf744428e1f48351b2d8132c16d2f35077a225d WatchSource:0}: Error finding container 7eec7926a14ba6c80aa4d3e55cf744428e1f48351b2d8132c16d2f35077a225d: Status 404 returned error can't find the container with id 7eec7926a14ba6c80aa4d3e55cf744428e1f48351b2d8132c16d2f35077a225d Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.569904 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.756088 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:23:28 crc kubenswrapper[4799]: I0319 20:23:28.756691 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:23:29 crc kubenswrapper[4799]: I0319 20:23:29.149739 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77894fe-c799-4924-9ef1-563270ec8035" path="/var/lib/kubelet/pods/d77894fe-c799-4924-9ef1-563270ec8035/volumes" Mar 19 20:23:29 crc kubenswrapper[4799]: I0319 20:23:29.318939 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bac784a-8455-4061-a876-94e8b394a40d","Type":"ContainerStarted","Data":"7eec7926a14ba6c80aa4d3e55cf744428e1f48351b2d8132c16d2f35077a225d"} Mar 19 20:23:29 crc kubenswrapper[4799]: I0319 20:23:29.331536 4799 generic.go:334] "Generic (PLEG): container finished" podID="d181492c-247b-49bc-8e40-7a531d76011d" containerID="9dd1345a95985bd5e9ab8c472616b9969b3978cc7c77af256b7420b2660baf87" exitCode=0 Mar 19 20:23:29 crc kubenswrapper[4799]: I0319 20:23:29.331594 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" event={"ID":"d181492c-247b-49bc-8e40-7a531d76011d","Type":"ContainerDied","Data":"9dd1345a95985bd5e9ab8c472616b9969b3978cc7c77af256b7420b2660baf87"} Mar 19 20:23:29 crc kubenswrapper[4799]: I0319 
20:23:29.336237 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665b847dc-rfzv9"] Mar 19 20:23:29 crc kubenswrapper[4799]: I0319 20:23:29.340434 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44606ca0-c233-4390-95af-93a2f6ad55ea","Type":"ContainerStarted","Data":"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25"} Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.413741 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-log" containerID="cri-o://a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25" gracePeriod=30 Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.414506 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44606ca0-c233-4390-95af-93a2f6ad55ea","Type":"ContainerStarted","Data":"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8"} Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.414765 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-httpd" containerID="cri-o://982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8" gracePeriod=30 Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.428014 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bac784a-8455-4061-a876-94e8b394a40d","Type":"ContainerStarted","Data":"41fa60a48b600cee8524ec84a16debdd0e42ec1624f23a612830ca6113a95fb0"} Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.434874 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" 
event={"ID":"d181492c-247b-49bc-8e40-7a531d76011d","Type":"ContainerStarted","Data":"16ef29d8555bec2f8f01fda7a20bc1d2ad121f4f0a40152be30669cf8becbd43"} Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.435550 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.443601 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.443584425 podStartE2EDuration="5.443584425s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:30.438523247 +0000 UTC m=+1088.044476319" watchObservedRunningTime="2026-03-19 20:23:30.443584425 +0000 UTC m=+1088.049537497" Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.449790 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665b847dc-rfzv9" event={"ID":"9c8cb96f-24fe-46e8-a18f-2175ed50b782","Type":"ContainerStarted","Data":"dbae282bc1dd6b475b0147be6c7dedfc3351b312d3b0da6e8135bd1c0563dd97"} Mar 19 20:23:30 crc kubenswrapper[4799]: I0319 20:23:30.477142 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" podStartSLOduration=5.477125866 podStartE2EDuration="5.477125866s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:30.474845433 +0000 UTC m=+1088.080798505" watchObservedRunningTime="2026-03-19 20:23:30.477125866 +0000 UTC m=+1088.083078938" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.179650 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270320 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-combined-ca-bundle\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270458 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-logs\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270493 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270517 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-config-data\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270562 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-scripts\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270587 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-internal-tls-certs\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270645 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqkhg\" (UniqueName: \"kubernetes.io/projected/44606ca0-c233-4390-95af-93a2f6ad55ea-kube-api-access-dqkhg\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.270679 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-httpd-run\") pod \"44606ca0-c233-4390-95af-93a2f6ad55ea\" (UID: \"44606ca0-c233-4390-95af-93a2f6ad55ea\") " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.271047 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-logs" (OuterVolumeSpecName: "logs") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.271457 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.271953 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.277156 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.278032 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-scripts" (OuterVolumeSpecName: "scripts") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.279099 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44606ca0-c233-4390-95af-93a2f6ad55ea-kube-api-access-dqkhg" (OuterVolumeSpecName: "kube-api-access-dqkhg") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "kube-api-access-dqkhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.317274 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.353523 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-config-data" (OuterVolumeSpecName: "config-data") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.366952 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44606ca0-c233-4390-95af-93a2f6ad55ea" (UID: "44606ca0-c233-4390-95af-93a2f6ad55ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.373244 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44606ca0-c233-4390-95af-93a2f6ad55ea-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.373269 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.373300 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.373310 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc 
kubenswrapper[4799]: I0319 20:23:31.373319 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.373326 4799 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44606ca0-c233-4390-95af-93a2f6ad55ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.373337 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqkhg\" (UniqueName: \"kubernetes.io/projected/44606ca0-c233-4390-95af-93a2f6ad55ea-kube-api-access-dqkhg\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.393230 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459662 4799 generic.go:334] "Generic (PLEG): container finished" podID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerID="982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8" exitCode=0 Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459701 4799 generic.go:334] "Generic (PLEG): container finished" podID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerID="a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25" exitCode=143 Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459734 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44606ca0-c233-4390-95af-93a2f6ad55ea","Type":"ContainerDied","Data":"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8"} Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459760 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"44606ca0-c233-4390-95af-93a2f6ad55ea","Type":"ContainerDied","Data":"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25"} Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459772 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"44606ca0-c233-4390-95af-93a2f6ad55ea","Type":"ContainerDied","Data":"37c8d616661a680fa74263a68250f3dc04b446337c2d5cb7d5f035dd626da74e"} Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459787 4799 scope.go:117] "RemoveContainer" containerID="982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.459901 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.466864 4799 generic.go:334] "Generic (PLEG): container finished" podID="ea453291-0fc9-4f73-8d5a-762b02101d81" containerID="46e9a2e2b408783696a5ac06152210d1cd5ac8edd8f6a8b42c724202356e5a2f" exitCode=0 Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.466923 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q74bx" event={"ID":"ea453291-0fc9-4f73-8d5a-762b02101d81","Type":"ContainerDied","Data":"46e9a2e2b408783696a5ac06152210d1cd5ac8edd8f6a8b42c724202356e5a2f"} Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.475806 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.477221 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-log" 
containerID="cri-o://41fa60a48b600cee8524ec84a16debdd0e42ec1624f23a612830ca6113a95fb0" gracePeriod=30 Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.478028 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bac784a-8455-4061-a876-94e8b394a40d","Type":"ContainerStarted","Data":"570dc7e6aa98272f82f0e2f81d8147cb1c80531ae358e206d2a01bb340be4d32"} Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.478120 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-httpd" containerID="cri-o://570dc7e6aa98272f82f0e2f81d8147cb1c80531ae358e206d2a01bb340be4d32" gracePeriod=30 Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.509529 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.517255 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.524802 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.524786413 podStartE2EDuration="6.524786413s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:31.520574557 +0000 UTC m=+1089.126527629" watchObservedRunningTime="2026-03-19 20:23:31.524786413 +0000 UTC m=+1089.130739485" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.557841 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:31 crc kubenswrapper[4799]: E0319 20:23:31.558196 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-log" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.558212 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-log" Mar 19 20:23:31 crc kubenswrapper[4799]: E0319 20:23:31.558225 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-httpd" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.558232 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-httpd" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.558402 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-httpd" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.558413 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" containerName="glance-log" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.573198 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.580763 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.580993 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.584980 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.683403 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.683490 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.683517 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-logs\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.683985 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qxg\" (UniqueName: 
\"kubernetes.io/projected/46507e9b-c305-48e4-9a5a-84cb3fd1e695-kube-api-access-j4qxg\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.684057 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.684153 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.684215 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.684248 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.786477 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qxg\" (UniqueName: 
\"kubernetes.io/projected/46507e9b-c305-48e4-9a5a-84cb3fd1e695-kube-api-access-j4qxg\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.786866 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.786925 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.786959 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.786984 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.787032 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.787051 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.787067 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-logs\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.787495 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-logs\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.788511 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.789301 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.793840 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.794316 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.797943 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.804511 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.808494 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qxg\" (UniqueName: \"kubernetes.io/projected/46507e9b-c305-48e4-9a5a-84cb3fd1e695-kube-api-access-j4qxg\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.826300 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:23:31 crc kubenswrapper[4799]: I0319 20:23:31.906599 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:32 crc kubenswrapper[4799]: I0319 20:23:32.495160 4799 generic.go:334] "Generic (PLEG): container finished" podID="2bac784a-8455-4061-a876-94e8b394a40d" containerID="570dc7e6aa98272f82f0e2f81d8147cb1c80531ae358e206d2a01bb340be4d32" exitCode=0 Mar 19 20:23:32 crc kubenswrapper[4799]: I0319 20:23:32.495195 4799 generic.go:334] "Generic (PLEG): container finished" podID="2bac784a-8455-4061-a876-94e8b394a40d" containerID="41fa60a48b600cee8524ec84a16debdd0e42ec1624f23a612830ca6113a95fb0" exitCode=143 Mar 19 20:23:32 crc kubenswrapper[4799]: I0319 20:23:32.495531 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bac784a-8455-4061-a876-94e8b394a40d","Type":"ContainerDied","Data":"570dc7e6aa98272f82f0e2f81d8147cb1c80531ae358e206d2a01bb340be4d32"} Mar 19 20:23:32 crc kubenswrapper[4799]: I0319 20:23:32.495577 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bac784a-8455-4061-a876-94e8b394a40d","Type":"ContainerDied","Data":"41fa60a48b600cee8524ec84a16debdd0e42ec1624f23a612830ca6113a95fb0"} Mar 19 20:23:33 crc kubenswrapper[4799]: I0319 20:23:33.134330 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44606ca0-c233-4390-95af-93a2f6ad55ea" path="/var/lib/kubelet/pods/44606ca0-c233-4390-95af-93a2f6ad55ea/volumes" Mar 19 20:23:34 crc 
kubenswrapper[4799]: I0319 20:23:34.608669 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5679479bf7-5r6zg"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.661062 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8869c89f8-jvpbt"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.662496 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.675046 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.680474 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8869c89f8-jvpbt"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.752152 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-tls-certs\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.752200 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-scripts\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.752228 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-combined-ca-bundle\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 
crc kubenswrapper[4799]: I0319 20:23:34.752268 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-config-data\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.752414 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-secret-key\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.752431 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf2a634-2499-4ea6-853f-9c8852d65e01-logs\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.752455 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmpjj\" (UniqueName: \"kubernetes.io/projected/ecf2a634-2499-4ea6-853f-9c8852d65e01-kube-api-access-zmpjj\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.762465 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.782615 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665b847dc-rfzv9"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.791901 4799 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/horizon-56454c8868-kxl79"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.793830 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.825376 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56454c8868-kxl79"] Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.854969 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-config-data\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855076 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9b7bec9-2633-410d-be4e-c65c9a903a38-config-data\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855100 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b7bec9-2633-410d-be4e-c65c9a903a38-logs\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855127 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-horizon-tls-certs\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 
20:23:34.855188 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6nxc\" (UniqueName: \"kubernetes.io/projected/d9b7bec9-2633-410d-be4e-c65c9a903a38-kube-api-access-j6nxc\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855211 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7bec9-2633-410d-be4e-c65c9a903a38-scripts\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855273 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-horizon-secret-key\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855301 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-secret-key\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855331 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf2a634-2499-4ea6-853f-9c8852d65e01-logs\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855419 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zmpjj\" (UniqueName: \"kubernetes.io/projected/ecf2a634-2499-4ea6-853f-9c8852d65e01-kube-api-access-zmpjj\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855451 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-tls-certs\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855496 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-scripts\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855519 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-combined-ca-bundle\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.855541 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-combined-ca-bundle\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.856662 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-config-data\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.856867 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf2a634-2499-4ea6-853f-9c8852d65e01-logs\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.859991 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-scripts\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.861771 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-secret-key\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.861879 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-combined-ca-bundle\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.869856 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-tls-certs\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " 
pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.872539 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmpjj\" (UniqueName: \"kubernetes.io/projected/ecf2a634-2499-4ea6-853f-9c8852d65e01-kube-api-access-zmpjj\") pod \"horizon-8869c89f8-jvpbt\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958151 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-horizon-secret-key\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958255 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-combined-ca-bundle\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958321 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9b7bec9-2633-410d-be4e-c65c9a903a38-config-data\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958343 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b7bec9-2633-410d-be4e-c65c9a903a38-logs\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958369 
4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-horizon-tls-certs\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958402 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6nxc\" (UniqueName: \"kubernetes.io/projected/d9b7bec9-2633-410d-be4e-c65c9a903a38-kube-api-access-j6nxc\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.958424 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7bec9-2633-410d-be4e-c65c9a903a38-scripts\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.959079 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9b7bec9-2633-410d-be4e-c65c9a903a38-scripts\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.959119 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b7bec9-2633-410d-be4e-c65c9a903a38-logs\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.959879 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d9b7bec9-2633-410d-be4e-c65c9a903a38-config-data\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.962837 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-horizon-tls-certs\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.963824 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-combined-ca-bundle\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.965765 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9b7bec9-2633-410d-be4e-c65c9a903a38-horizon-secret-key\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.983807 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:23:34 crc kubenswrapper[4799]: I0319 20:23:34.996951 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6nxc\" (UniqueName: \"kubernetes.io/projected/d9b7bec9-2633-410d-be4e-c65c9a903a38-kube-api-access-j6nxc\") pod \"horizon-56454c8868-kxl79\" (UID: \"d9b7bec9-2633-410d-be4e-c65c9a903a38\") " pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:35 crc kubenswrapper[4799]: I0319 20:23:35.120440 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:23:36 crc kubenswrapper[4799]: I0319 20:23:36.387556 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:23:36 crc kubenswrapper[4799]: I0319 20:23:36.461647 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-hr57v"] Mar 19 20:23:36 crc kubenswrapper[4799]: I0319 20:23:36.461955 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" containerID="cri-o://fc53f21c29cf93cfa43596a855e36cb873eecf948b3252d5cdbbbc2309d21d27" gracePeriod=10 Mar 19 20:23:37 crc kubenswrapper[4799]: I0319 20:23:37.547286 4799 generic.go:334] "Generic (PLEG): container finished" podID="a2923901-dc23-464a-871f-85d66cebadb1" containerID="fc53f21c29cf93cfa43596a855e36cb873eecf948b3252d5cdbbbc2309d21d27" exitCode=0 Mar 19 20:23:37 crc kubenswrapper[4799]: I0319 20:23:37.547313 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" event={"ID":"a2923901-dc23-464a-871f-85d66cebadb1","Type":"ContainerDied","Data":"fc53f21c29cf93cfa43596a855e36cb873eecf948b3252d5cdbbbc2309d21d27"} Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.411880 4799 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.531705 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvkwz\" (UniqueName: \"kubernetes.io/projected/ea453291-0fc9-4f73-8d5a-762b02101d81-kube-api-access-pvkwz\") pod \"ea453291-0fc9-4f73-8d5a-762b02101d81\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.531888 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-scripts\") pod \"ea453291-0fc9-4f73-8d5a-762b02101d81\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.531976 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-fernet-keys\") pod \"ea453291-0fc9-4f73-8d5a-762b02101d81\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.532020 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-credential-keys\") pod \"ea453291-0fc9-4f73-8d5a-762b02101d81\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.532098 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-combined-ca-bundle\") pod \"ea453291-0fc9-4f73-8d5a-762b02101d81\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.532129 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-config-data\") pod \"ea453291-0fc9-4f73-8d5a-762b02101d81\" (UID: \"ea453291-0fc9-4f73-8d5a-762b02101d81\") " Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.538672 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ea453291-0fc9-4f73-8d5a-762b02101d81" (UID: "ea453291-0fc9-4f73-8d5a-762b02101d81"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.538714 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-scripts" (OuterVolumeSpecName: "scripts") pod "ea453291-0fc9-4f73-8d5a-762b02101d81" (UID: "ea453291-0fc9-4f73-8d5a-762b02101d81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.540547 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea453291-0fc9-4f73-8d5a-762b02101d81-kube-api-access-pvkwz" (OuterVolumeSpecName: "kube-api-access-pvkwz") pod "ea453291-0fc9-4f73-8d5a-762b02101d81" (UID: "ea453291-0fc9-4f73-8d5a-762b02101d81"). InnerVolumeSpecName "kube-api-access-pvkwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.552923 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ea453291-0fc9-4f73-8d5a-762b02101d81" (UID: "ea453291-0fc9-4f73-8d5a-762b02101d81"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.566313 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q74bx" event={"ID":"ea453291-0fc9-4f73-8d5a-762b02101d81","Type":"ContainerDied","Data":"a279e379228fb5b893b9a2158ce1a82a02ae7c56da11c9aa81f30c0a053749cc"} Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.566354 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a279e379228fb5b893b9a2158ce1a82a02ae7c56da11c9aa81f30c0a053749cc" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.566404 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q74bx" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.569592 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea453291-0fc9-4f73-8d5a-762b02101d81" (UID: "ea453291-0fc9-4f73-8d5a-762b02101d81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.579436 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-config-data" (OuterVolumeSpecName: "config-data") pod "ea453291-0fc9-4f73-8d5a-762b02101d81" (UID: "ea453291-0fc9-4f73-8d5a-762b02101d81"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.634259 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.634294 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvkwz\" (UniqueName: \"kubernetes.io/projected/ea453291-0fc9-4f73-8d5a-762b02101d81-kube-api-access-pvkwz\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.634303 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.634312 4799 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.634320 4799 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:38 crc kubenswrapper[4799]: I0319 20:23:38.634328 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea453291-0fc9-4f73-8d5a-762b02101d81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:39 crc kubenswrapper[4799]: E0319 20:23:39.013256 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea453291_0fc9_4f73_8d5a_762b02101d81.slice/crio-a279e379228fb5b893b9a2158ce1a82a02ae7c56da11c9aa81f30c0a053749cc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea453291_0fc9_4f73_8d5a_762b02101d81.slice\": RecentStats: unable to find data in memory cache]" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.510754 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q74bx"] Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.521961 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q74bx"] Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.609333 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8c9qh"] Mar 19 20:23:39 crc kubenswrapper[4799]: E0319 20:23:39.609805 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea453291-0fc9-4f73-8d5a-762b02101d81" containerName="keystone-bootstrap" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.609823 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea453291-0fc9-4f73-8d5a-762b02101d81" containerName="keystone-bootstrap" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.610054 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea453291-0fc9-4f73-8d5a-762b02101d81" containerName="keystone-bootstrap" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.610907 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.618243 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.618361 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vdqvk" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.618243 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.618595 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.618814 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.620201 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8c9qh"] Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.652217 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-fernet-keys\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.652309 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-config-data\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.652419 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-combined-ca-bundle\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.652452 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-credential-keys\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.652502 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtjmg\" (UniqueName: \"kubernetes.io/projected/b23ff57a-7365-4543-8a58-b9df4a3e52f4-kube-api-access-wtjmg\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.652528 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-scripts\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.753826 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-combined-ca-bundle\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.753864 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-credential-keys\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.753912 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtjmg\" (UniqueName: \"kubernetes.io/projected/b23ff57a-7365-4543-8a58-b9df4a3e52f4-kube-api-access-wtjmg\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.753932 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-scripts\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.753982 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-fernet-keys\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.754016 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-config-data\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.759415 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-config-data\") pod \"keystone-bootstrap-8c9qh\" (UID: 
\"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.759485 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-combined-ca-bundle\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.759691 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-fernet-keys\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.760613 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-scripts\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.774045 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-credential-keys\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 20:23:39.779414 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtjmg\" (UniqueName: \"kubernetes.io/projected/b23ff57a-7365-4543-8a58-b9df4a3e52f4-kube-api-access-wtjmg\") pod \"keystone-bootstrap-8c9qh\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:39 crc kubenswrapper[4799]: I0319 
20:23:39.926366 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:23:40 crc kubenswrapper[4799]: I0319 20:23:40.450701 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 19 20:23:41 crc kubenswrapper[4799]: I0319 20:23:41.125964 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea453291-0fc9-4f73-8d5a-762b02101d81" path="/var/lib/kubelet/pods/ea453291-0fc9-4f73-8d5a-762b02101d81/volumes" Mar 19 20:23:44 crc kubenswrapper[4799]: I0319 20:23:44.628648 4799 generic.go:334] "Generic (PLEG): container finished" podID="901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" containerID="d9a73fe93c5da4a1b0141d12844c0ea46ffb7817596288a956a936e5b578eb9f" exitCode=0 Mar 19 20:23:44 crc kubenswrapper[4799]: I0319 20:23:44.628771 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mdm7f" event={"ID":"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6","Type":"ContainerDied","Data":"d9a73fe93c5da4a1b0141d12844c0ea46ffb7817596288a956a936e5b578eb9f"} Mar 19 20:23:45 crc kubenswrapper[4799]: I0319 20:23:45.450738 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 19 20:23:45 crc kubenswrapper[4799]: E0319 20:23:45.561787 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940" Mar 19 20:23:45 crc kubenswrapper[4799]: E0319 20:23:45.562053 4799 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndchf9h575h556h5dbh568hd6h64bhb4h58bh668h647h696h589h9chf9hb7h646hdfhb8h84h8dhdch64ch65hdch56fh98hb7hfbhbchdbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jstgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} 
start failed in pod horizon-5679479bf7-5r6zg_openstack(6b887e22-48a3-4d49-96d7-5788d8e69ef2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:23:45 crc kubenswrapper[4799]: E0319 20:23:45.572809 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940\\\"\"]" pod="openstack/horizon-5679479bf7-5r6zg" podUID="6b887e22-48a3-4d49-96d7-5788d8e69ef2" Mar 19 20:23:45 crc kubenswrapper[4799]: E0319 20:23:45.584314 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940" Mar 19 20:23:45 crc kubenswrapper[4799]: E0319 20:23:45.584613 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f4h5f8h8ch5cbh5b7h555h5h97h5ch5c5h699h64h599hfbh55dh5dfh84h654hf9h68ch679hf4hd6h7chc4hdbh564h59fhc4hcch97h57q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-729xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-665b847dc-rfzv9_openstack(9c8cb96f-24fe-46e8-a18f-2175ed50b782): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:23:45 crc kubenswrapper[4799]: E0319 20:23:45.587830 
4799 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940\\\"\"]" pod="openstack/horizon-665b847dc-rfzv9" podUID="9c8cb96f-24fe-46e8-a18f-2175ed50b782" Mar 19 20:23:50 crc kubenswrapper[4799]: I0319 20:23:50.450583 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 19 20:23:50 crc kubenswrapper[4799]: I0319 20:23:50.451187 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:52 crc kubenswrapper[4799]: I0319 20:23:52.966716 4799 scope.go:117] "RemoveContainer" containerID="a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25" Mar 19 20:23:52 crc kubenswrapper[4799]: E0319 20:23:52.979579 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940" Mar 19 20:23:52 crc kubenswrapper[4799]: E0319 20:23:52.979797 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n584h89h5cch54fh685h64dhb9h578h59h5fh68dh5cchf5hcdhf6hdh85hdfh74h58ch55dh589h5b7h57ch59bh595h86h8ch676h54h579h54q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z68vd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-566c4bf65-cxp7g_openstack(4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:23:52 crc kubenswrapper[4799]: E0319 
20:23:52.987423 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:1d608ea76d94cfdd3ffdb8eca5f712a93be241813862ce74974d925087c5b940\\\"\"]" pod="openstack/horizon-566c4bf65-cxp7g" podUID="4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.080113 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226502 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-config-data\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226549 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226598 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-public-tls-certs\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226615 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-httpd-run\") pod 
\"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226637 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxfdl\" (UniqueName: \"kubernetes.io/projected/2bac784a-8455-4061-a876-94e8b394a40d-kube-api-access-mxfdl\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226708 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-combined-ca-bundle\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226756 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-scripts\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.226775 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-logs\") pod \"2bac784a-8455-4061-a876-94e8b394a40d\" (UID: \"2bac784a-8455-4061-a876-94e8b394a40d\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.227031 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.227206 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-logs" (OuterVolumeSpecName: "logs") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.228564 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.228592 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bac784a-8455-4061-a876-94e8b394a40d-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.232013 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.234279 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bac784a-8455-4061-a876-94e8b394a40d-kube-api-access-mxfdl" (OuterVolumeSpecName: "kube-api-access-mxfdl") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "kube-api-access-mxfdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.236720 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-scripts" (OuterVolumeSpecName: "scripts") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.259016 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.285846 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.286684 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-config-data" (OuterVolumeSpecName: "config-data") pod "2bac784a-8455-4061-a876-94e8b394a40d" (UID: "2bac784a-8455-4061-a876-94e8b394a40d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.330468 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.330518 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.330533 4799 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.330547 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxfdl\" (UniqueName: \"kubernetes.io/projected/2bac784a-8455-4061-a876-94e8b394a40d-kube-api-access-mxfdl\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.330556 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.330564 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bac784a-8455-4061-a876-94e8b394a40d-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.348597 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.433488 4799 reconciler_common.go:293] "Volume detached for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.506976 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a" Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.507215 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ktcs4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:n
il,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rcxnv_openstack(a40c3083-66e1-4fb7-b4b4-88fa2935cb49): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.508613 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rcxnv" podUID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.517868 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mdm7f" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.538077 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.539994 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.637890 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b887e22-48a3-4d49-96d7-5788d8e69ef2-horizon-secret-key\") pod \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638004 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-config-data\") pod \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638102 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-config\") pod \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638209 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-scripts\") pod \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638242 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jstgm\" (UniqueName: \"kubernetes.io/projected/6b887e22-48a3-4d49-96d7-5788d8e69ef2-kube-api-access-jstgm\") pod \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638323 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-combined-ca-bundle\") pod \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638414 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2q48\" (UniqueName: \"kubernetes.io/projected/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-kube-api-access-g2q48\") pod \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\" (UID: \"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638437 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b887e22-48a3-4d49-96d7-5788d8e69ef2-logs\") pod \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\" (UID: \"6b887e22-48a3-4d49-96d7-5788d8e69ef2\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.638612 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-config-data" (OuterVolumeSpecName: "config-data") pod "6b887e22-48a3-4d49-96d7-5788d8e69ef2" (UID: "6b887e22-48a3-4d49-96d7-5788d8e69ef2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.641429 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-scripts" (OuterVolumeSpecName: "scripts") pod "6b887e22-48a3-4d49-96d7-5788d8e69ef2" (UID: "6b887e22-48a3-4d49-96d7-5788d8e69ef2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.642177 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b887e22-48a3-4d49-96d7-5788d8e69ef2-logs" (OuterVolumeSpecName: "logs") pod "6b887e22-48a3-4d49-96d7-5788d8e69ef2" (UID: "6b887e22-48a3-4d49-96d7-5788d8e69ef2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.642534 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-kube-api-access-g2q48" (OuterVolumeSpecName: "kube-api-access-g2q48") pod "901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" (UID: "901bb6a6-df70-44d8-a3e7-b8de5d4b51d6"). InnerVolumeSpecName "kube-api-access-g2q48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.644010 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b887e22-48a3-4d49-96d7-5788d8e69ef2-kube-api-access-jstgm" (OuterVolumeSpecName: "kube-api-access-jstgm") pod "6b887e22-48a3-4d49-96d7-5788d8e69ef2" (UID: "6b887e22-48a3-4d49-96d7-5788d8e69ef2"). InnerVolumeSpecName "kube-api-access-jstgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.645297 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b887e22-48a3-4d49-96d7-5788d8e69ef2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6b887e22-48a3-4d49-96d7-5788d8e69ef2" (UID: "6b887e22-48a3-4d49-96d7-5788d8e69ef2"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.661177 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-config" (OuterVolumeSpecName: "config") pod "901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" (UID: "901bb6a6-df70-44d8-a3e7-b8de5d4b51d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.674054 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" (UID: "901bb6a6-df70-44d8-a3e7-b8de5d4b51d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.720806 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5679479bf7-5r6zg" event={"ID":"6b887e22-48a3-4d49-96d7-5788d8e69ef2","Type":"ContainerDied","Data":"20b371064a8f6d7874aa5d1023fd3c3f1f9249bec68cfe811c6461b88d0ee00a"} Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.720848 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5679479bf7-5r6zg" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.722408 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bac784a-8455-4061-a876-94e8b394a40d","Type":"ContainerDied","Data":"7eec7926a14ba6c80aa4d3e55cf744428e1f48351b2d8132c16d2f35077a225d"} Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.722431 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.723892 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mdm7f" event={"ID":"901bb6a6-df70-44d8-a3e7-b8de5d4b51d6","Type":"ContainerDied","Data":"9ecceff605a0c72b41d027a6d3944306f4190f0cb06fa1f5e2a54d8b93ebbb5a"} Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.723938 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ecceff605a0c72b41d027a6d3944306f4190f0cb06fa1f5e2a54d8b93ebbb5a" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.723979 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mdm7f" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.727248 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665b847dc-rfzv9" event={"ID":"9c8cb96f-24fe-46e8-a18f-2175ed50b782","Type":"ContainerDied","Data":"dbae282bc1dd6b475b0147be6c7dedfc3351b312d3b0da6e8135bd1c0563dd97"} Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.727317 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-665b847dc-rfzv9" Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.732256 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-rcxnv" podUID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.740541 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-scripts\") pod \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.740577 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729xh\" (UniqueName: \"kubernetes.io/projected/9c8cb96f-24fe-46e8-a18f-2175ed50b782-kube-api-access-729xh\") pod \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.740950 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8cb96f-24fe-46e8-a18f-2175ed50b782-horizon-secret-key\") pod \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.741242 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-scripts" (OuterVolumeSpecName: "scripts") pod "9c8cb96f-24fe-46e8-a18f-2175ed50b782" (UID: "9c8cb96f-24fe-46e8-a18f-2175ed50b782"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.741265 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-config-data\") pod \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.741325 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8cb96f-24fe-46e8-a18f-2175ed50b782-logs\") pod \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\" (UID: \"9c8cb96f-24fe-46e8-a18f-2175ed50b782\") " Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.741752 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-config-data" (OuterVolumeSpecName: "config-data") pod "9c8cb96f-24fe-46e8-a18f-2175ed50b782" (UID: "9c8cb96f-24fe-46e8-a18f-2175ed50b782"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742239 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2q48\" (UniqueName: \"kubernetes.io/projected/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-kube-api-access-g2q48\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742259 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b887e22-48a3-4d49-96d7-5788d8e69ef2-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742241 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c8cb96f-24fe-46e8-a18f-2175ed50b782-logs" (OuterVolumeSpecName: "logs") pod "9c8cb96f-24fe-46e8-a18f-2175ed50b782" (UID: "9c8cb96f-24fe-46e8-a18f-2175ed50b782"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742376 4799 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b887e22-48a3-4d49-96d7-5788d8e69ef2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742402 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742412 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742421 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b887e22-48a3-4d49-96d7-5788d8e69ef2-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742430 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstgm\" (UniqueName: \"kubernetes.io/projected/6b887e22-48a3-4d49-96d7-5788d8e69ef2-kube-api-access-jstgm\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742439 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742448 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.742456 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cb96f-24fe-46e8-a18f-2175ed50b782-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.746478 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8cb96f-24fe-46e8-a18f-2175ed50b782-kube-api-access-729xh" (OuterVolumeSpecName: "kube-api-access-729xh") pod "9c8cb96f-24fe-46e8-a18f-2175ed50b782" (UID: "9c8cb96f-24fe-46e8-a18f-2175ed50b782"). InnerVolumeSpecName "kube-api-access-729xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.764830 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c8cb96f-24fe-46e8-a18f-2175ed50b782-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9c8cb96f-24fe-46e8-a18f-2175ed50b782" (UID: "9c8cb96f-24fe-46e8-a18f-2175ed50b782"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.791046 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.804034 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.815900 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.816285 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-log" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.816302 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-log" Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.816320 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" containerName="neutron-db-sync" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.816329 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" containerName="neutron-db-sync" Mar 19 20:23:53 crc kubenswrapper[4799]: E0319 20:23:53.816340 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-httpd" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.816347 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-httpd" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.816534 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-log" Mar 19 20:23:53 crc 
kubenswrapper[4799]: I0319 20:23:53.816550 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" containerName="neutron-db-sync" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.816561 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bac784a-8455-4061-a876-94e8b394a40d" containerName="glance-httpd" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.817424 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.823929 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.824174 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.829797 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5679479bf7-5r6zg"] Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.843743 4799 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c8cb96f-24fe-46e8-a18f-2175ed50b782-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.843775 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c8cb96f-24fe-46e8-a18f-2175ed50b782-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.843785 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729xh\" (UniqueName: \"kubernetes.io/projected/9c8cb96f-24fe-46e8-a18f-2175ed50b782-kube-api-access-729xh\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.864517 4799 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/horizon-5679479bf7-5r6zg"] Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.871221 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.946322 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.946363 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-scripts\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.946624 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.946765 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-logs\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.946812 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.949382 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/46527d99-7ba4-4e4e-baf3-77be33ab0460-kube-api-access-kqnlf\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.949547 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:53 crc kubenswrapper[4799]: I0319 20:23:53.949719 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-config-data\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051431 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/46527d99-7ba4-4e4e-baf3-77be33ab0460-kube-api-access-kqnlf\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051501 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051529 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-config-data\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051558 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051577 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-scripts\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051632 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051676 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-logs\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.051705 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.052478 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.052884 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.053642 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-logs\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.058368 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.069940 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.076859 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-config-data\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.079562 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.082032 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.088784 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/46527d99-7ba4-4e4e-baf3-77be33ab0460-kube-api-access-kqnlf\") pod \"glance-default-external-api-0\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " pod="openstack/glance-default-external-api-0" 
Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.090719 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665b847dc-rfzv9"] Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.097188 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-665b847dc-rfzv9"] Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.137077 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:23:54 crc kubenswrapper[4799]: E0319 20:23:54.808383 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 19 20:23:54 crc kubenswrapper[4799]: E0319 20:23:54.808791 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q56wz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jnnwh_openstack(51a34e40-cb35-4589-9fcd-20130bd7831f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:23:54 crc kubenswrapper[4799]: E0319 20:23:54.810373 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jnnwh" podUID="51a34e40-cb35-4589-9fcd-20130bd7831f" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.820909 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-z9nhz"] Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.825582 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.851902 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-z9nhz"] Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.912091 4799 scope.go:117] "RemoveContainer" containerID="982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8" Mar 19 20:23:54 crc kubenswrapper[4799]: E0319 20:23:54.912872 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8\": container with ID starting with 982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8 not found: ID does not exist" containerID="982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.912904 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8"} err="failed to get container status \"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8\": rpc error: code = NotFound desc = could not find container \"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8\": container with ID starting with 982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8 not found: ID does not exist" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.912927 4799 scope.go:117] "RemoveContainer" containerID="a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25" Mar 19 20:23:54 crc kubenswrapper[4799]: E0319 20:23:54.913229 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25\": container with ID starting with 
a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25 not found: ID does not exist" containerID="a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.913279 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25"} err="failed to get container status \"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25\": rpc error: code = NotFound desc = could not find container \"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25\": container with ID starting with a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25 not found: ID does not exist" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.913293 4799 scope.go:117] "RemoveContainer" containerID="982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.913543 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8"} err="failed to get container status \"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8\": rpc error: code = NotFound desc = could not find container \"982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8\": container with ID starting with 982ddb9dfe1e050c8637aa7f27c52dfc9e0e1598dd932852be40e8b623b776b8 not found: ID does not exist" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.913561 4799 scope.go:117] "RemoveContainer" containerID="a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.913733 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25"} err="failed to get container status 
\"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25\": rpc error: code = NotFound desc = could not find container \"a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25\": container with ID starting with a70e40f7af363cd2659e0c3c35bd3816fddaf571912ad1e2157395fea07fbe25 not found: ID does not exist" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.913750 4799 scope.go:117] "RemoveContainer" containerID="570dc7e6aa98272f82f0e2f81d8147cb1c80531ae358e206d2a01bb340be4d32" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.943651 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-699d85bbdd-vfn2t"] Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.945318 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.947487 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.947866 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.949223 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 20:23:54 crc kubenswrapper[4799]: I0319 20:23:54.951323 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6ln6j" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.002813 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffkm\" (UniqueName: \"kubernetes.io/projected/794e2fbb-5a64-4bc9-b25a-9041256b23ea-kube-api-access-kffkm\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.002884 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-combined-ca-bundle\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.002959 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-config\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.003037 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-ovndb-tls-certs\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.003072 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.003097 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2l2c\" (UniqueName: \"kubernetes.io/projected/003320d3-b74f-4b94-9636-6b468817f9f1-kube-api-access-d2l2c\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 
20:23:55.003270 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-svc\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.003295 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.003328 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.003435 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-config\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.008658 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-699d85bbdd-vfn2t"] Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.015446 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-httpd-config\") pod 
\"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.128543 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-svc\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.128829 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.128857 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-config\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129284 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-httpd-config\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " 
pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129308 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kffkm\" (UniqueName: \"kubernetes.io/projected/794e2fbb-5a64-4bc9-b25a-9041256b23ea-kube-api-access-kffkm\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129328 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-combined-ca-bundle\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129376 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-config\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129481 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-ovndb-tls-certs\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.129508 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 
20:23:55.129528 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2l2c\" (UniqueName: \"kubernetes.io/projected/003320d3-b74f-4b94-9636-6b468817f9f1-kube-api-access-d2l2c\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.131931 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-svc\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.136284 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.137050 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bac784a-8455-4061-a876-94e8b394a40d" path="/var/lib/kubelet/pods/2bac784a-8455-4061-a876-94e8b394a40d/volumes" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.138504 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.139423 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-combined-ca-bundle\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.139727 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b887e22-48a3-4d49-96d7-5788d8e69ef2" path="/var/lib/kubelet/pods/6b887e22-48a3-4d49-96d7-5788d8e69ef2/volumes" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.140194 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8cb96f-24fe-46e8-a18f-2175ed50b782" path="/var/lib/kubelet/pods/9c8cb96f-24fe-46e8-a18f-2175ed50b782/volumes" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.143417 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-httpd-config\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.144307 4799 scope.go:117] "RemoveContainer" containerID="41fa60a48b600cee8524ec84a16debdd0e42ec1624f23a612830ca6113a95fb0" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.145207 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.145413 4799 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-config\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.145830 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.148801 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c4bf65-cxp7g" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.153450 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-ovndb-tls-certs\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.154780 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2l2c\" (UniqueName: \"kubernetes.io/projected/003320d3-b74f-4b94-9636-6b468817f9f1-kube-api-access-d2l2c\") pod \"dnsmasq-dns-5f9bff4fdf-z9nhz\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.160982 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-config\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.161632 
4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffkm\" (UniqueName: \"kubernetes.io/projected/794e2fbb-5a64-4bc9-b25a-9041256b23ea-kube-api-access-kffkm\") pod \"neutron-699d85bbdd-vfn2t\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.235290 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-scripts\") pod \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.235845 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-scripts" (OuterVolumeSpecName: "scripts") pod "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" (UID: "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236154 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-logs\") pod \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236212 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-svc\") pod \"a2923901-dc23-464a-871f-85d66cebadb1\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236275 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-horizon-secret-key\") pod \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236374 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-nb\") pod \"a2923901-dc23-464a-871f-85d66cebadb1\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236429 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-swift-storage-0\") pod \"a2923901-dc23-464a-871f-85d66cebadb1\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236449 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-config\") pod \"a2923901-dc23-464a-871f-85d66cebadb1\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236465 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z68vd\" (UniqueName: \"kubernetes.io/projected/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-kube-api-access-z68vd\") pod \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236481 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-config-data\") pod \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\" (UID: \"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236509 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-sb\") pod \"a2923901-dc23-464a-871f-85d66cebadb1\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236525 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd9cp\" (UniqueName: \"kubernetes.io/projected/a2923901-dc23-464a-871f-85d66cebadb1-kube-api-access-zd9cp\") pod \"a2923901-dc23-464a-871f-85d66cebadb1\" (UID: \"a2923901-dc23-464a-871f-85d66cebadb1\") " Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.236722 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-logs" (OuterVolumeSpecName: "logs") pod "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" (UID: "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.240170 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-config-data" (OuterVolumeSpecName: "config-data") pod "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" (UID: "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.259611 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2923901-dc23-464a-871f-85d66cebadb1-kube-api-access-zd9cp" (OuterVolumeSpecName: "kube-api-access-zd9cp") pod "a2923901-dc23-464a-871f-85d66cebadb1" (UID: "a2923901-dc23-464a-871f-85d66cebadb1"). InnerVolumeSpecName "kube-api-access-zd9cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.262265 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.262318 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd9cp\" (UniqueName: \"kubernetes.io/projected/a2923901-dc23-464a-871f-85d66cebadb1-kube-api-access-zd9cp\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.262329 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.262338 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-logs\") on node \"crc\" DevicePath \"\"" 
Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.267766 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.285136 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" (UID: "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.288517 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-kube-api-access-z68vd" (OuterVolumeSpecName: "kube-api-access-z68vd") pod "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" (UID: "4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4"). InnerVolumeSpecName "kube-api-access-z68vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.363857 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z68vd\" (UniqueName: \"kubernetes.io/projected/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-kube-api-access-z68vd\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.363888 4799 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.410958 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a2923901-dc23-464a-871f-85d66cebadb1" (UID: "a2923901-dc23-464a-871f-85d66cebadb1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.414281 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2923901-dc23-464a-871f-85d66cebadb1" (UID: "a2923901-dc23-464a-871f-85d66cebadb1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.421573 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.430788 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2923901-dc23-464a-871f-85d66cebadb1" (UID: "a2923901-dc23-464a-871f-85d66cebadb1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.439878 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2923901-dc23-464a-871f-85d66cebadb1" (UID: "a2923901-dc23-464a-871f-85d66cebadb1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.446136 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-config" (OuterVolumeSpecName: "config") pod "a2923901-dc23-464a-871f-85d66cebadb1" (UID: "a2923901-dc23-464a-871f-85d66cebadb1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.465660 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.465684 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.465698 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.465711 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.465722 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2923901-dc23-464a-871f-85d66cebadb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.752861 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" event={"ID":"a2923901-dc23-464a-871f-85d66cebadb1","Type":"ContainerDied","Data":"c92660e1be29e6fe60eb2882b3dfa9b55f442444d54fa1316fbd7e2e37f5fca1"} Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.752911 4799 scope.go:117] "RemoveContainer" containerID="fc53f21c29cf93cfa43596a855e36cb873eecf948b3252d5cdbbbc2309d21d27" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.753009 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-hr57v" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.760897 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s6gmj" event={"ID":"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd","Type":"ContainerStarted","Data":"1445ff62688485c809fec8edc4c9dd52aee914ac649fe3ffccb276995babf78b"} Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.762913 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566c4bf65-cxp7g" event={"ID":"4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4","Type":"ContainerDied","Data":"d205baf5c30c80723fad860d7a18e7fb524edbb31a6a7f8df6c09d3e74a4a351"} Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.762997 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566c4bf65-cxp7g" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.778164 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerStarted","Data":"9d014abe28c600f1b9544ac596c5d4d8e3713e7df4e2e0f510a846ecd8b92231"} Mar 19 20:23:55 crc kubenswrapper[4799]: E0319 20:23:55.780711 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-jnnwh" podUID="51a34e40-cb35-4589-9fcd-20130bd7831f" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.790799 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-s6gmj" podStartSLOduration=4.6402748129999996 podStartE2EDuration="30.790776961s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="2026-03-19 20:23:27.342395059 +0000 UTC m=+1084.948348131" 
lastFinishedPulling="2026-03-19 20:23:53.492897207 +0000 UTC m=+1111.098850279" observedRunningTime="2026-03-19 20:23:55.781467246 +0000 UTC m=+1113.387420318" watchObservedRunningTime="2026-03-19 20:23:55.790776961 +0000 UTC m=+1113.396730033" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.827263 4799 scope.go:117] "RemoveContainer" containerID="ae2e6cd8f1c327fd69f6d8cbd30bc5dee7e83ab32533d4d5b31fccf0a59a0ae7" Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.874065 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-hr57v"] Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.898532 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-hr57v"] Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.911746 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8869c89f8-jvpbt"] Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.925335 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8c9qh"] Mar 19 20:23:55 crc kubenswrapper[4799]: W0319 20:23:55.982641 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod003320d3_b74f_4b94_9636_6b468817f9f1.slice/crio-7965a5a272114b45f75262f4457b1980c97209ee3c4af91f2c2e62cbff2ccb95 WatchSource:0}: Error finding container 7965a5a272114b45f75262f4457b1980c97209ee3c4af91f2c2e62cbff2ccb95: Status 404 returned error can't find the container with id 7965a5a272114b45f75262f4457b1980c97209ee3c4af91f2c2e62cbff2ccb95 Mar 19 20:23:55 crc kubenswrapper[4799]: I0319 20:23:55.989664 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56454c8868-kxl79"] Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.011933 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-566c4bf65-cxp7g"] Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.026693 4799 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-566c4bf65-cxp7g"] Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.034616 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.042247 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-z9nhz"] Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.053616 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-699d85bbdd-vfn2t"] Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.840281 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699d85bbdd-vfn2t" event={"ID":"794e2fbb-5a64-4bc9-b25a-9041256b23ea","Type":"ContainerStarted","Data":"20c0185206eb3196343f35f5c5920f7734a7afa5c139e36eb4292590b3552508"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.841999 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699d85bbdd-vfn2t" event={"ID":"794e2fbb-5a64-4bc9-b25a-9041256b23ea","Type":"ContainerStarted","Data":"9a11f9f5a9fa98125b8a45799e58a65178cf6d659d9caac7412dd20c13cc0a97"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.842024 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699d85bbdd-vfn2t" event={"ID":"794e2fbb-5a64-4bc9-b25a-9041256b23ea","Type":"ContainerStarted","Data":"4e563b25a31661852ef45e42c086ca84e78ae10096ac6ffce5a3972184de05b6"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.842054 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.883387 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-699d85bbdd-vfn2t" podStartSLOduration=2.883360512 podStartE2EDuration="2.883360512s" podCreationTimestamp="2026-03-19 20:23:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:56.876611577 +0000 UTC m=+1114.482564659" watchObservedRunningTime="2026-03-19 20:23:56.883360512 +0000 UTC m=+1114.489313584" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.911881 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8c9qh" event={"ID":"b23ff57a-7365-4543-8a58-b9df4a3e52f4","Type":"ContainerStarted","Data":"2b28f11163b433e88869f33aef128502e7a4e9631d4b643afd00365ecad9b182"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.911924 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8c9qh" event={"ID":"b23ff57a-7365-4543-8a58-b9df4a3e52f4","Type":"ContainerStarted","Data":"23a0331a1d1745f43054e46770decda40d2b371d2f072f3074923b84644da715"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.933175 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46527d99-7ba4-4e4e-baf3-77be33ab0460","Type":"ContainerStarted","Data":"71c0fca97b439f69dcf9e6636cbdf117b255ba609995991f274e83aeb2c785ef"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.933228 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46527d99-7ba4-4e4e-baf3-77be33ab0460","Type":"ContainerStarted","Data":"908ce2130866427936e02644c2930d27d61dfc7add84d359df6d62f650a394e6"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.952123 4799 generic.go:334] "Generic (PLEG): container finished" podID="003320d3-b74f-4b94-9636-6b468817f9f1" containerID="341636d22e92d1204032f55e431e97c243b62c7cdb38bde6d518ec898a4c1ced" exitCode=0 Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.952323 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" 
event={"ID":"003320d3-b74f-4b94-9636-6b468817f9f1","Type":"ContainerDied","Data":"341636d22e92d1204032f55e431e97c243b62c7cdb38bde6d518ec898a4c1ced"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.952418 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" event={"ID":"003320d3-b74f-4b94-9636-6b468817f9f1","Type":"ContainerStarted","Data":"7965a5a272114b45f75262f4457b1980c97209ee3c4af91f2c2e62cbff2ccb95"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.961538 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56454c8868-kxl79" event={"ID":"d9b7bec9-2633-410d-be4e-c65c9a903a38","Type":"ContainerStarted","Data":"c77081b46516021e31370850fd003afc6d38d5d70940d38cc057af578cb0c954"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.961587 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56454c8868-kxl79" event={"ID":"d9b7bec9-2633-410d-be4e-c65c9a903a38","Type":"ContainerStarted","Data":"d861e215fc733ea75e43e18782a0db537521c36f23f8aba7d82ae3fb747279fa"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.966121 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79858f4d8f-lj6r6"] Mar 19 20:23:56 crc kubenswrapper[4799]: E0319 20:23:56.966606 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.966619 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" Mar 19 20:23:56 crc kubenswrapper[4799]: E0319 20:23:56.966631 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="init" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.966637 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2923901-dc23-464a-871f-85d66cebadb1" 
containerName="init" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.966829 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2923901-dc23-464a-871f-85d66cebadb1" containerName="dnsmasq-dns" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.967062 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8c9qh" podStartSLOduration=17.967047191 podStartE2EDuration="17.967047191s" podCreationTimestamp="2026-03-19 20:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:56.935485388 +0000 UTC m=+1114.541438460" watchObservedRunningTime="2026-03-19 20:23:56.967047191 +0000 UTC m=+1114.573000263" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.968158 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.975171 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.975322 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.999322 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8869c89f8-jvpbt" event={"ID":"ecf2a634-2499-4ea6-853f-9c8852d65e01","Type":"ContainerStarted","Data":"ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.999366 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8869c89f8-jvpbt" event={"ID":"ecf2a634-2499-4ea6-853f-9c8852d65e01","Type":"ContainerStarted","Data":"0d5befac045ea7bc35cc692c6786017987a9ea930b1eb74042dccde4dee53355"} Mar 19 20:23:56 crc kubenswrapper[4799]: I0319 20:23:56.999382 4799 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.007513 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79858f4d8f-lj6r6"] Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.048586 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8869c89f8-jvpbt" podStartSLOduration=22.551553374 podStartE2EDuration="23.048563551s" podCreationTimestamp="2026-03-19 20:23:34 +0000 UTC" firstStartedPulling="2026-03-19 20:23:55.860274802 +0000 UTC m=+1113.466227874" lastFinishedPulling="2026-03-19 20:23:56.357284979 +0000 UTC m=+1113.963238051" observedRunningTime="2026-03-19 20:23:57.045036235 +0000 UTC m=+1114.650989327" watchObservedRunningTime="2026-03-19 20:23:57.048563551 +0000 UTC m=+1114.654516623" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.104911 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-combined-ca-bundle\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.105152 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-public-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.105228 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-httpd-config\") pod \"neutron-79858f4d8f-lj6r6\" (UID: 
\"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.105399 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-config\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.105484 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbzkt\" (UniqueName: \"kubernetes.io/projected/bec976b7-ee8e-46ca-bc02-c47336ee303b-kube-api-access-wbzkt\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.105514 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-internal-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.105538 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-ovndb-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.144654 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4" path="/var/lib/kubelet/pods/4bc06e29-fb9b-4d1e-9b01-b9c79ddf1fe4/volumes" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.145544 
4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2923901-dc23-464a-871f-85d66cebadb1" path="/var/lib/kubelet/pods/a2923901-dc23-464a-871f-85d66cebadb1/volumes" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207095 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-ovndb-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207169 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-combined-ca-bundle\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207240 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-public-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207262 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-httpd-config\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207290 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-config\") pod \"neutron-79858f4d8f-lj6r6\" (UID: 
\"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207328 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbzkt\" (UniqueName: \"kubernetes.io/projected/bec976b7-ee8e-46ca-bc02-c47336ee303b-kube-api-access-wbzkt\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.207351 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-internal-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.214510 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-config\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.214710 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-internal-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.214780 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-combined-ca-bundle\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc 
kubenswrapper[4799]: I0319 20:23:57.215999 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-public-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.220339 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-ovndb-tls-certs\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.224048 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-httpd-config\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.225203 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbzkt\" (UniqueName: \"kubernetes.io/projected/bec976b7-ee8e-46ca-bc02-c47336ee303b-kube-api-access-wbzkt\") pod \"neutron-79858f4d8f-lj6r6\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:57 crc kubenswrapper[4799]: I0319 20:23:57.478793 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.016254 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8869c89f8-jvpbt" event={"ID":"ecf2a634-2499-4ea6-853f-9c8852d65e01","Type":"ContainerStarted","Data":"132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612"} Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.035547 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46527d99-7ba4-4e4e-baf3-77be33ab0460","Type":"ContainerStarted","Data":"75acd7be3a759b0800e888157f520500d7460087d231ed76271305b389800abf"} Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.042421 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" event={"ID":"003320d3-b74f-4b94-9636-6b468817f9f1","Type":"ContainerStarted","Data":"4833337c696fc7e189b49a5a66bd734547c7528a173f2d684778e1bdab3ab4e8"} Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.047731 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46507e9b-c305-48e4-9a5a-84cb3fd1e695","Type":"ContainerStarted","Data":"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675"} Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.047758 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46507e9b-c305-48e4-9a5a-84cb3fd1e695","Type":"ContainerStarted","Data":"94fed15ac560e271fa3ed65273918911c0c681767db5e1e494fbacb3ab0487f9"} Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.049602 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56454c8868-kxl79" event={"ID":"d9b7bec9-2633-410d-be4e-c65c9a903a38","Type":"ContainerStarted","Data":"01d3834c53e9b3f2b52c01742db7c847216a137cd1ab11d131614487403a5938"} Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 
20:23:58.054955 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.054943474 podStartE2EDuration="5.054943474s" podCreationTimestamp="2026-03-19 20:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:58.052675842 +0000 UTC m=+1115.658628914" watchObservedRunningTime="2026-03-19 20:23:58.054943474 +0000 UTC m=+1115.660896546" Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.081544 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56454c8868-kxl79" podStartSLOduration=23.477968329 podStartE2EDuration="24.081529521s" podCreationTimestamp="2026-03-19 20:23:34 +0000 UTC" firstStartedPulling="2026-03-19 20:23:55.901760297 +0000 UTC m=+1113.507713369" lastFinishedPulling="2026-03-19 20:23:56.505321489 +0000 UTC m=+1114.111274561" observedRunningTime="2026-03-19 20:23:58.076843613 +0000 UTC m=+1115.682796675" watchObservedRunningTime="2026-03-19 20:23:58.081529521 +0000 UTC m=+1115.687482583" Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.392911 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79858f4d8f-lj6r6"] Mar 19 20:23:58 crc kubenswrapper[4799]: W0319 20:23:58.394463 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbec976b7_ee8e_46ca_bc02_c47336ee303b.slice/crio-6eb648c29eae0ea5c2c54ef0f9e9136a0e6c5a96f2c4fb226c46fba12746d11e WatchSource:0}: Error finding container 6eb648c29eae0ea5c2c54ef0f9e9136a0e6c5a96f2c4fb226c46fba12746d11e: Status 404 returned error can't find the container with id 6eb648c29eae0ea5c2c54ef0f9e9136a0e6c5a96f2c4fb226c46fba12746d11e Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.756239 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:23:58 crc kubenswrapper[4799]: I0319 20:23:58.756770 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.058708 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79858f4d8f-lj6r6" event={"ID":"bec976b7-ee8e-46ca-bc02-c47336ee303b","Type":"ContainerStarted","Data":"88c9011868828ff91e6dc46d7ff538d531a509ae0fe0fd8b03f81917a3ba7333"} Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.058991 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.059004 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79858f4d8f-lj6r6" event={"ID":"bec976b7-ee8e-46ca-bc02-c47336ee303b","Type":"ContainerStarted","Data":"3477927a0ef32b69a01e92568fc7afdb648c24c81e3a7b27ab7bdf39cd3e8b8d"} Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.059014 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79858f4d8f-lj6r6" event={"ID":"bec976b7-ee8e-46ca-bc02-c47336ee303b","Type":"ContainerStarted","Data":"6eb648c29eae0ea5c2c54ef0f9e9136a0e6c5a96f2c4fb226c46fba12746d11e"} Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.061844 4799 generic.go:334] "Generic (PLEG): container finished" podID="e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" containerID="1445ff62688485c809fec8edc4c9dd52aee914ac649fe3ffccb276995babf78b" exitCode=0 Mar 19 20:23:59 crc 
kubenswrapper[4799]: I0319 20:23:59.061907 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s6gmj" event={"ID":"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd","Type":"ContainerDied","Data":"1445ff62688485c809fec8edc4c9dd52aee914ac649fe3ffccb276995babf78b"} Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.064009 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerStarted","Data":"d6d021779bbd46cc8379d28b90aee35c6c7cf1ef2db3cf0d06c8f6533ed7fdc2"} Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.066166 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46507e9b-c305-48e4-9a5a-84cb3fd1e695","Type":"ContainerStarted","Data":"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1"} Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.066232 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-httpd" containerID="cri-o://6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1" gracePeriod=30 Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.066209 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-log" containerID="cri-o://2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675" gracePeriod=30 Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.114672 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79858f4d8f-lj6r6" podStartSLOduration=3.114648095 podStartE2EDuration="3.114648095s" podCreationTimestamp="2026-03-19 20:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:59.08085538 +0000 UTC m=+1116.686808452" watchObservedRunningTime="2026-03-19 20:23:59.114648095 +0000 UTC m=+1116.720601167" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.144966 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=28.144944984 podStartE2EDuration="28.144944984s" podCreationTimestamp="2026-03-19 20:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:59.138317452 +0000 UTC m=+1116.744270524" watchObservedRunningTime="2026-03-19 20:23:59.144944984 +0000 UTC m=+1116.750898056" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.158164 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" podStartSLOduration=5.158149185 podStartE2EDuration="5.158149185s" podCreationTimestamp="2026-03-19 20:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:23:59.152197322 +0000 UTC m=+1116.758150394" watchObservedRunningTime="2026-03-19 20:23:59.158149185 +0000 UTC m=+1116.764102257" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.731862 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.765987 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-logs\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766055 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-combined-ca-bundle\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766111 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-httpd-run\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766125 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-config-data\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766143 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766262 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4qxg\" (UniqueName: 
\"kubernetes.io/projected/46507e9b-c305-48e4-9a5a-84cb3fd1e695-kube-api-access-j4qxg\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766280 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-scripts\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.766317 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-internal-tls-certs\") pod \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\" (UID: \"46507e9b-c305-48e4-9a5a-84cb3fd1e695\") " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.767439 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.770101 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-logs" (OuterVolumeSpecName: "logs") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.774826 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46507e9b-c305-48e4-9a5a-84cb3fd1e695-kube-api-access-j4qxg" (OuterVolumeSpecName: "kube-api-access-j4qxg") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "kube-api-access-j4qxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.775742 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-scripts" (OuterVolumeSpecName: "scripts") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.790545 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.830921 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.843434 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-config-data" (OuterVolumeSpecName: "config-data") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.862967 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "46507e9b-c305-48e4-9a5a-84cb3fd1e695" (UID: "46507e9b-c305-48e4-9a5a-84cb3fd1e695"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868152 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868182 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868217 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868228 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4qxg\" (UniqueName: \"kubernetes.io/projected/46507e9b-c305-48e4-9a5a-84cb3fd1e695-kube-api-access-j4qxg\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc 
kubenswrapper[4799]: I0319 20:23:59.868239 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868247 4799 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868255 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46507e9b-c305-48e4-9a5a-84cb3fd1e695-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.868262 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46507e9b-c305-48e4-9a5a-84cb3fd1e695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.887077 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 19 20:23:59 crc kubenswrapper[4799]: I0319 20:23:59.970780 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.080211 4799 generic.go:334] "Generic (PLEG): container finished" podID="b23ff57a-7365-4543-8a58-b9df4a3e52f4" containerID="2b28f11163b433e88869f33aef128502e7a4e9631d4b643afd00365ecad9b182" exitCode=0 Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.080281 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8c9qh" 
event={"ID":"b23ff57a-7365-4543-8a58-b9df4a3e52f4","Type":"ContainerDied","Data":"2b28f11163b433e88869f33aef128502e7a4e9631d4b643afd00365ecad9b182"} Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083043 4799 generic.go:334] "Generic (PLEG): container finished" podID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerID="6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1" exitCode=0 Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083075 4799 generic.go:334] "Generic (PLEG): container finished" podID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerID="2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675" exitCode=143 Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083130 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083192 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46507e9b-c305-48e4-9a5a-84cb3fd1e695","Type":"ContainerDied","Data":"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1"} Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083260 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46507e9b-c305-48e4-9a5a-84cb3fd1e695","Type":"ContainerDied","Data":"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675"} Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083271 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46507e9b-c305-48e4-9a5a-84cb3fd1e695","Type":"ContainerDied","Data":"94fed15ac560e271fa3ed65273918911c0c681767db5e1e494fbacb3ab0487f9"} Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.083289 4799 scope.go:117] "RemoveContainer" containerID="6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1" Mar 19 20:24:00 crc 
kubenswrapper[4799]: I0319 20:24:00.114355 4799 scope.go:117] "RemoveContainer" containerID="2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.132339 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.155487 4799 scope.go:117] "RemoveContainer" containerID="6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.155590 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565864-nkqck"] Mar 19 20:24:00 crc kubenswrapper[4799]: E0319 20:24:00.155903 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-httpd" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.155920 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-httpd" Mar 19 20:24:00 crc kubenswrapper[4799]: E0319 20:24:00.155948 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-log" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.155955 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-log" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.156098 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-httpd" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.156115 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" containerName="glance-log" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.156629 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.161990 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.162057 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.162318 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.168414 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:00 crc kubenswrapper[4799]: E0319 20:24:00.170865 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1\": container with ID starting with 6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1 not found: ID does not exist" containerID="6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.170897 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1"} err="failed to get container status \"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1\": rpc error: code = NotFound desc = could not find container \"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1\": container with ID starting with 6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1 not found: ID does not exist" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.170921 4799 scope.go:117] "RemoveContainer" 
containerID="2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675" Mar 19 20:24:00 crc kubenswrapper[4799]: E0319 20:24:00.171238 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675\": container with ID starting with 2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675 not found: ID does not exist" containerID="2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.171259 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675"} err="failed to get container status \"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675\": rpc error: code = NotFound desc = could not find container \"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675\": container with ID starting with 2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675 not found: ID does not exist" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.171270 4799 scope.go:117] "RemoveContainer" containerID="6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.171811 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1"} err="failed to get container status \"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1\": rpc error: code = NotFound desc = could not find container \"6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1\": container with ID starting with 6b37995f4c401be8c3ddbd6ab86dcd2cfc737e92b13995e8769c04c6f852a0f1 not found: ID does not exist" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.171839 4799 scope.go:117] 
"RemoveContainer" containerID="2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.172029 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675"} err="failed to get container status \"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675\": rpc error: code = NotFound desc = could not find container \"2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675\": container with ID starting with 2e4497f90f1a0e144184e46cf9d8a1b3f2a8578dfef6632d3b41666c7b3b9675 not found: ID does not exist" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.178994 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-nkqck"] Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.190495 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.191962 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.195083 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.195308 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.200160 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275461 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnq4\" (UniqueName: \"kubernetes.io/projected/28f8c9fc-4f87-4da0-aa76-908eb7e729d6-kube-api-access-wwnq4\") pod \"auto-csr-approver-29565864-nkqck\" (UID: \"28f8c9fc-4f87-4da0-aa76-908eb7e729d6\") " pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275506 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275543 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275570 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fwxh5\" (UniqueName: \"kubernetes.io/projected/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-kube-api-access-fwxh5\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275602 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275621 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275645 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275669 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.275710 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377268 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377667 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377720 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377768 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnq4\" (UniqueName: \"kubernetes.io/projected/28f8c9fc-4f87-4da0-aa76-908eb7e729d6-kube-api-access-wwnq4\") pod \"auto-csr-approver-29565864-nkqck\" (UID: \"28f8c9fc-4f87-4da0-aa76-908eb7e729d6\") " pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377794 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377825 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377854 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxh5\" (UniqueName: \"kubernetes.io/projected/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-kube-api-access-fwxh5\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377887 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.377911 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.381061 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.381085 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.381322 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.384224 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.397179 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.398807 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnq4\" (UniqueName: \"kubernetes.io/projected/28f8c9fc-4f87-4da0-aa76-908eb7e729d6-kube-api-access-wwnq4\") pod 
\"auto-csr-approver-29565864-nkqck\" (UID: \"28f8c9fc-4f87-4da0-aa76-908eb7e729d6\") " pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.400879 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxh5\" (UniqueName: \"kubernetes.io/projected/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-kube-api-access-fwxh5\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.401001 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.422508 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.442918 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.444069 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.490633 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.514842 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.519884 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s6gmj" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.581216 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxgvf\" (UniqueName: \"kubernetes.io/projected/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-kube-api-access-sxgvf\") pod \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.581324 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-scripts\") pod \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.581370 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-combined-ca-bundle\") pod \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.581410 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-config-data\") pod \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.581519 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-logs\") pod \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\" (UID: \"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd\") " Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.582015 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-logs" (OuterVolumeSpecName: "logs") pod "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" (UID: "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.585183 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-scripts" (OuterVolumeSpecName: "scripts") pod "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" (UID: "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.587299 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-kube-api-access-sxgvf" (OuterVolumeSpecName: "kube-api-access-sxgvf") pod "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" (UID: "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd"). InnerVolumeSpecName "kube-api-access-sxgvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.608909 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-config-data" (OuterVolumeSpecName: "config-data") pod "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" (UID: "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.611298 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" (UID: "e0e1253c-8230-46c8-b0b7-3b34a42ae0bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.683399 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.683730 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.683743 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.683752 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:00 crc kubenswrapper[4799]: I0319 20:24:00.683762 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxgvf\" (UniqueName: \"kubernetes.io/projected/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd-kube-api-access-sxgvf\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.047263 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-nkqck"] Mar 19 
20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.112931 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-nkqck" event={"ID":"28f8c9fc-4f87-4da0-aa76-908eb7e729d6","Type":"ContainerStarted","Data":"69819e79455c3c60cff7850910f87664e4f736f0271ee61f274358906a37c755"} Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.115334 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s6gmj" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.115454 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s6gmj" event={"ID":"e0e1253c-8230-46c8-b0b7-3b34a42ae0bd","Type":"ContainerDied","Data":"878ae09b8a2a3df9edd6e2f5c3c4ef254ad58d40ef4f19a190d70f30580aea68"} Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.115509 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="878ae09b8a2a3df9edd6e2f5c3c4ef254ad58d40ef4f19a190d70f30580aea68" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.150081 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46507e9b-c305-48e4-9a5a-84cb3fd1e695" path="/var/lib/kubelet/pods/46507e9b-c305-48e4-9a5a-84cb3fd1e695/volumes" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.218257 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.314116 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-787b8d7874-lck4d"] Mar 19 20:24:01 crc kubenswrapper[4799]: E0319 20:24:01.314471 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" containerName="placement-db-sync" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.314487 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" 
containerName="placement-db-sync" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.314662 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" containerName="placement-db-sync" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.315525 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.322370 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.322450 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.322834 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.324510 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2875p" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.324686 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.326703 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-787b8d7874-lck4d"] Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406267 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-public-tls-certs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406627 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-combined-ca-bundle\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406735 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-scripts\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406767 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-internal-tls-certs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406800 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-config-data\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406818 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810c2730-0702-4ff9-b62e-74c6bc564149-logs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.406905 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xrjrv\" (UniqueName: \"kubernetes.io/projected/810c2730-0702-4ff9-b62e-74c6bc564149-kube-api-access-xrjrv\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.496124 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.507476 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-scripts\") pod \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.507622 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-credential-keys\") pod \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.507667 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-combined-ca-bundle\") pod \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.507743 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtjmg\" (UniqueName: \"kubernetes.io/projected/b23ff57a-7365-4543-8a58-b9df4a3e52f4-kube-api-access-wtjmg\") pod \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.507811 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-fernet-keys\") pod \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.507834 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-config-data\") pod \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\" (UID: \"b23ff57a-7365-4543-8a58-b9df4a3e52f4\") " Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508049 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-combined-ca-bundle\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508109 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-scripts\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508126 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-internal-tls-certs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508151 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-config-data\") pod \"placement-787b8d7874-lck4d\" (UID: 
\"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508170 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810c2730-0702-4ff9-b62e-74c6bc564149-logs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508213 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrjrv\" (UniqueName: \"kubernetes.io/projected/810c2730-0702-4ff9-b62e-74c6bc564149-kube-api-access-xrjrv\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.508238 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-public-tls-certs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.513507 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810c2730-0702-4ff9-b62e-74c6bc564149-logs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.516066 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-public-tls-certs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc 
kubenswrapper[4799]: I0319 20:24:01.516240 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-config-data\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.518654 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-scripts" (OuterVolumeSpecName: "scripts") pod "b23ff57a-7365-4543-8a58-b9df4a3e52f4" (UID: "b23ff57a-7365-4543-8a58-b9df4a3e52f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.519128 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b23ff57a-7365-4543-8a58-b9df4a3e52f4" (UID: "b23ff57a-7365-4543-8a58-b9df4a3e52f4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.521210 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b23ff57a-7365-4543-8a58-b9df4a3e52f4-kube-api-access-wtjmg" (OuterVolumeSpecName: "kube-api-access-wtjmg") pod "b23ff57a-7365-4543-8a58-b9df4a3e52f4" (UID: "b23ff57a-7365-4543-8a58-b9df4a3e52f4"). InnerVolumeSpecName "kube-api-access-wtjmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.524861 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b23ff57a-7365-4543-8a58-b9df4a3e52f4" (UID: "b23ff57a-7365-4543-8a58-b9df4a3e52f4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.524961 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-combined-ca-bundle\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.526247 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-internal-tls-certs\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.532605 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-scripts\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.545632 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrjrv\" (UniqueName: \"kubernetes.io/projected/810c2730-0702-4ff9-b62e-74c6bc564149-kube-api-access-xrjrv\") pod \"placement-787b8d7874-lck4d\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " pod="openstack/placement-787b8d7874-lck4d" 
Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.551911 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b23ff57a-7365-4543-8a58-b9df4a3e52f4" (UID: "b23ff57a-7365-4543-8a58-b9df4a3e52f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.570602 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-config-data" (OuterVolumeSpecName: "config-data") pod "b23ff57a-7365-4543-8a58-b9df4a3e52f4" (UID: "b23ff57a-7365-4543-8a58-b9df4a3e52f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.609807 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtjmg\" (UniqueName: \"kubernetes.io/projected/b23ff57a-7365-4543-8a58-b9df4a3e52f4-kube-api-access-wtjmg\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.609843 4799 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.609853 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.609861 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.609871 4799 
reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.609879 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23ff57a-7365-4543-8a58-b9df4a3e52f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:01 crc kubenswrapper[4799]: I0319 20:24:01.704578 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.135564 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8c9qh" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.135575 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8c9qh" event={"ID":"b23ff57a-7365-4543-8a58-b9df4a3e52f4","Type":"ContainerDied","Data":"23a0331a1d1745f43054e46770decda40d2b371d2f072f3074923b84644da715"} Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.136463 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a0331a1d1745f43054e46770decda40d2b371d2f072f3074923b84644da715" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.162479 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9f48b3d-c638-4337-b3f7-c599bcf7ef72","Type":"ContainerStarted","Data":"ceda4e3b87a68da048e50594e55ccf16030a4f96471fe0e9b6f01caa2486b91d"} Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.199046 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76f77cb758-wwjbp"] Mar 19 20:24:02 crc kubenswrapper[4799]: E0319 20:24:02.199501 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b23ff57a-7365-4543-8a58-b9df4a3e52f4" containerName="keystone-bootstrap" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.199522 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23ff57a-7365-4543-8a58-b9df4a3e52f4" containerName="keystone-bootstrap" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.199729 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23ff57a-7365-4543-8a58-b9df4a3e52f4" containerName="keystone-bootstrap" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.200556 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.202756 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.203465 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vdqvk" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.204750 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.210815 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.211012 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.211138 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.213432 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76f77cb758-wwjbp"] Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.229858 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-credential-keys\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.229897 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-scripts\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.229957 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-combined-ca-bundle\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.229996 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzrm\" (UniqueName: \"kubernetes.io/projected/0eea0f4a-eab7-4dae-844b-f614654cd6d4-kube-api-access-qwzrm\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.230041 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-config-data\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.230078 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-public-tls-certs\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.230157 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-fernet-keys\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.232249 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-internal-tls-certs\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.232152 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-787b8d7874-lck4d"] Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.333798 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-combined-ca-bundle\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.333866 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzrm\" (UniqueName: \"kubernetes.io/projected/0eea0f4a-eab7-4dae-844b-f614654cd6d4-kube-api-access-qwzrm\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 
19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.333910 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-config-data\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.333956 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-public-tls-certs\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.333994 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-fernet-keys\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.334056 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-internal-tls-certs\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.334130 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-credential-keys\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.334151 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-scripts\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.340653 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-credential-keys\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.350998 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-combined-ca-bundle\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.356087 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-fernet-keys\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.356537 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-scripts\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.357834 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-public-tls-certs\") 
pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.361058 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzrm\" (UniqueName: \"kubernetes.io/projected/0eea0f4a-eab7-4dae-844b-f614654cd6d4-kube-api-access-qwzrm\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.372003 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-config-data\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.385872 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eea0f4a-eab7-4dae-844b-f614654cd6d4-internal-tls-certs\") pod \"keystone-76f77cb758-wwjbp\" (UID: \"0eea0f4a-eab7-4dae-844b-f614654cd6d4\") " pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:02 crc kubenswrapper[4799]: I0319 20:24:02.633094 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:03 crc kubenswrapper[4799]: I0319 20:24:03.193611 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-787b8d7874-lck4d" event={"ID":"810c2730-0702-4ff9-b62e-74c6bc564149","Type":"ContainerStarted","Data":"e1e19e8bc5fd6abf2f10ba72282b61e1fb452954fc8c1cf6a47b43995ce952ca"} Mar 19 20:24:03 crc kubenswrapper[4799]: I0319 20:24:03.195236 4799 generic.go:334] "Generic (PLEG): container finished" podID="28f8c9fc-4f87-4da0-aa76-908eb7e729d6" containerID="153ce1bd37302784eddb1b73676555827f74959125921a6a3148b8de7d0946c2" exitCode=0 Mar 19 20:24:03 crc kubenswrapper[4799]: I0319 20:24:03.195348 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-nkqck" event={"ID":"28f8c9fc-4f87-4da0-aa76-908eb7e729d6","Type":"ContainerDied","Data":"153ce1bd37302784eddb1b73676555827f74959125921a6a3148b8de7d0946c2"} Mar 19 20:24:03 crc kubenswrapper[4799]: I0319 20:24:03.197870 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9f48b3d-c638-4337-b3f7-c599bcf7ef72","Type":"ContainerStarted","Data":"268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f"} Mar 19 20:24:03 crc kubenswrapper[4799]: I0319 20:24:03.197912 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9f48b3d-c638-4337-b3f7-c599bcf7ef72","Type":"ContainerStarted","Data":"e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42"} Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.138145 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.138184 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 20:24:04 crc 
kubenswrapper[4799]: I0319 20:24:04.186827 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.190866 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.207532 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.207516512 podStartE2EDuration="4.207516512s" podCreationTimestamp="2026-03-19 20:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:03.234452401 +0000 UTC m=+1120.840405473" watchObservedRunningTime="2026-03-19 20:24:04.207516512 +0000 UTC m=+1121.813469584" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.212483 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.213174 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.984397 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:24:04 crc kubenswrapper[4799]: I0319 20:24:04.984880 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:24:05 crc kubenswrapper[4799]: I0319 20:24:05.125527 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:24:05 crc kubenswrapper[4799]: I0319 20:24:05.126059 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:24:05 crc 
kubenswrapper[4799]: I0319 20:24:05.427583 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:24:05 crc kubenswrapper[4799]: I0319 20:24:05.500135 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"] Mar 19 20:24:05 crc kubenswrapper[4799]: I0319 20:24:05.500354 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="dnsmasq-dns" containerID="cri-o://16ef29d8555bec2f8f01fda7a20bc1d2ad121f4f0a40152be30669cf8becbd43" gracePeriod=10 Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.198516 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.199012 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.264145 4799 generic.go:334] "Generic (PLEG): container finished" podID="d181492c-247b-49bc-8e40-7a531d76011d" containerID="16ef29d8555bec2f8f01fda7a20bc1d2ad121f4f0a40152be30669cf8becbd43" exitCode=0 Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.264315 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" event={"ID":"d181492c-247b-49bc-8e40-7a531d76011d","Type":"ContainerDied","Data":"16ef29d8555bec2f8f01fda7a20bc1d2ad121f4f0a40152be30669cf8becbd43"} Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.275355 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-nkqck" event={"ID":"28f8c9fc-4f87-4da0-aa76-908eb7e729d6","Type":"ContainerDied","Data":"69819e79455c3c60cff7850910f87664e4f736f0271ee61f274358906a37c755"} Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 
20:24:06.275406 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69819e79455c3c60cff7850910f87664e4f736f0271ee61f274358906a37c755" Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.322283 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.419050 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwnq4\" (UniqueName: \"kubernetes.io/projected/28f8c9fc-4f87-4da0-aa76-908eb7e729d6-kube-api-access-wwnq4\") pod \"28f8c9fc-4f87-4da0-aa76-908eb7e729d6\" (UID: \"28f8c9fc-4f87-4da0-aa76-908eb7e729d6\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.442649 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f8c9fc-4f87-4da0-aa76-908eb7e729d6-kube-api-access-wwnq4" (OuterVolumeSpecName: "kube-api-access-wwnq4") pod "28f8c9fc-4f87-4da0-aa76-908eb7e729d6" (UID: "28f8c9fc-4f87-4da0-aa76-908eb7e729d6"). InnerVolumeSpecName "kube-api-access-wwnq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.521197 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwnq4\" (UniqueName: \"kubernetes.io/projected/28f8c9fc-4f87-4da0-aa76-908eb7e729d6-kube-api-access-wwnq4\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:06 crc kubenswrapper[4799]: W0319 20:24:06.789204 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eea0f4a_eab7_4dae_844b_f614654cd6d4.slice/crio-ebddd4a2c09a581cab0929c799dbcdc2451a3d0a5c21f611358e3999124bd090 WatchSource:0}: Error finding container ebddd4a2c09a581cab0929c799dbcdc2451a3d0a5c21f611358e3999124bd090: Status 404 returned error can't find the container with id ebddd4a2c09a581cab0929c799dbcdc2451a3d0a5c21f611358e3999124bd090 Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.794502 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76f77cb758-wwjbp"] Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.859730 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.932344 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-swift-storage-0\") pod \"d181492c-247b-49bc-8e40-7a531d76011d\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.932597 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-nb\") pod \"d181492c-247b-49bc-8e40-7a531d76011d\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.932669 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-svc\") pod \"d181492c-247b-49bc-8e40-7a531d76011d\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.932716 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-sb\") pod \"d181492c-247b-49bc-8e40-7a531d76011d\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.932747 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkq7c\" (UniqueName: \"kubernetes.io/projected/d181492c-247b-49bc-8e40-7a531d76011d-kube-api-access-lkq7c\") pod \"d181492c-247b-49bc-8e40-7a531d76011d\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.932802 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-config\") pod \"d181492c-247b-49bc-8e40-7a531d76011d\" (UID: \"d181492c-247b-49bc-8e40-7a531d76011d\") " Mar 19 20:24:06 crc kubenswrapper[4799]: I0319 20:24:06.940614 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d181492c-247b-49bc-8e40-7a531d76011d-kube-api-access-lkq7c" (OuterVolumeSpecName: "kube-api-access-lkq7c") pod "d181492c-247b-49bc-8e40-7a531d76011d" (UID: "d181492c-247b-49bc-8e40-7a531d76011d"). InnerVolumeSpecName "kube-api-access-lkq7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.019330 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-config" (OuterVolumeSpecName: "config") pod "d181492c-247b-49bc-8e40-7a531d76011d" (UID: "d181492c-247b-49bc-8e40-7a531d76011d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.036148 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkq7c\" (UniqueName: \"kubernetes.io/projected/d181492c-247b-49bc-8e40-7a531d76011d-kube-api-access-lkq7c\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.036182 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.067590 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d181492c-247b-49bc-8e40-7a531d76011d" (UID: "d181492c-247b-49bc-8e40-7a531d76011d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.068626 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d181492c-247b-49bc-8e40-7a531d76011d" (UID: "d181492c-247b-49bc-8e40-7a531d76011d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.068759 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d181492c-247b-49bc-8e40-7a531d76011d" (UID: "d181492c-247b-49bc-8e40-7a531d76011d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.098071 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d181492c-247b-49bc-8e40-7a531d76011d" (UID: "d181492c-247b-49bc-8e40-7a531d76011d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.139894 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.140167 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.140239 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.140305 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d181492c-247b-49bc-8e40-7a531d76011d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.285691 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76f77cb758-wwjbp" event={"ID":"0eea0f4a-eab7-4dae-844b-f614654cd6d4","Type":"ContainerStarted","Data":"9bfb707ef64e0b5b2d1a37629c6a86805d17d214dbf5e7d8f4b18d4db1dfd3ef"} Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.286096 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76f77cb758-wwjbp" event={"ID":"0eea0f4a-eab7-4dae-844b-f614654cd6d4","Type":"ContainerStarted","Data":"ebddd4a2c09a581cab0929c799dbcdc2451a3d0a5c21f611358e3999124bd090"} Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.286123 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.288708 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-787b8d7874-lck4d" event={"ID":"810c2730-0702-4ff9-b62e-74c6bc564149","Type":"ContainerStarted","Data":"fe50d77a9790ca16fa168bd2cd4d26f4bc0a1815e39c3aeb520bac03f38be31e"} Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.288767 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-787b8d7874-lck4d" event={"ID":"810c2730-0702-4ff9-b62e-74c6bc564149","Type":"ContainerStarted","Data":"acfebf18501c3bc5158d9fb064ae6dd5c280143954fe019afa99be84553b7cbd"} Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.289209 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.289441 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.296211 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" event={"ID":"d181492c-247b-49bc-8e40-7a531d76011d","Type":"ContainerDied","Data":"57ebba1e06020a68b7d77c79992e2d29469f4575089d8e59fd4a054d7e2aa213"} Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.296297 4799 scope.go:117] "RemoveContainer" containerID="16ef29d8555bec2f8f01fda7a20bc1d2ad121f4f0a40152be30669cf8becbd43" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.296561 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.311894 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76f77cb758-wwjbp" podStartSLOduration=5.311875379 podStartE2EDuration="5.311875379s" podCreationTimestamp="2026-03-19 20:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:07.300850248 +0000 UTC m=+1124.906803320" watchObservedRunningTime="2026-03-19 20:24:07.311875379 +0000 UTC m=+1124.917828471" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.313544 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerStarted","Data":"49b4c92fe13b31dc0792a94c48277182a6fbd8562ee07c6a68a8a8c9da9e9227"} Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.313608 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-nkqck" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.328704 4799 scope.go:117] "RemoveContainer" containerID="9dd1345a95985bd5e9ab8c472616b9969b3978cc7c77af256b7420b2660baf87" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.339124 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-787b8d7874-lck4d" podStartSLOduration=6.339097774 podStartE2EDuration="6.339097774s" podCreationTimestamp="2026-03-19 20:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:07.328301789 +0000 UTC m=+1124.934254861" watchObservedRunningTime="2026-03-19 20:24:07.339097774 +0000 UTC m=+1124.945050846" Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.363507 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"] Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.372844 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-h4sjb"] Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.423478 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-mjzxf"] Mar 19 20:24:07 crc kubenswrapper[4799]: I0319 20:24:07.430187 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-mjzxf"] Mar 19 20:24:08 crc kubenswrapper[4799]: I0319 20:24:08.324769 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnnwh" event={"ID":"51a34e40-cb35-4589-9fcd-20130bd7831f","Type":"ContainerStarted","Data":"b8f230c7dc9e803ce7e492826f25e0bb103a0b6ffcfc36c14c3ad57a4b6ee6b9"} Mar 19 20:24:08 crc kubenswrapper[4799]: I0319 20:24:08.344229 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jnnwh" 
podStartSLOduration=2.932651072 podStartE2EDuration="43.344210972s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="2026-03-19 20:23:27.28413002 +0000 UTC m=+1084.890083092" lastFinishedPulling="2026-03-19 20:24:07.69568992 +0000 UTC m=+1125.301642992" observedRunningTime="2026-03-19 20:24:08.340196842 +0000 UTC m=+1125.946149914" watchObservedRunningTime="2026-03-19 20:24:08.344210972 +0000 UTC m=+1125.950164064" Mar 19 20:24:09 crc kubenswrapper[4799]: I0319 20:24:09.138412 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e85fea-caad-40ee-b17e-e238858fa50f" path="/var/lib/kubelet/pods/69e85fea-caad-40ee-b17e-e238858fa50f/volumes" Mar 19 20:24:09 crc kubenswrapper[4799]: I0319 20:24:09.140180 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d181492c-247b-49bc-8e40-7a531d76011d" path="/var/lib/kubelet/pods/d181492c-247b-49bc-8e40-7a531d76011d/volumes" Mar 19 20:24:10 crc kubenswrapper[4799]: I0319 20:24:10.350827 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rcxnv" event={"ID":"a40c3083-66e1-4fb7-b4b4-88fa2935cb49","Type":"ContainerStarted","Data":"9eb2958c467d6f996313ad694f7fd17a3da3991d9bc2c30eeeb5e8bd600e1cb0"} Mar 19 20:24:10 crc kubenswrapper[4799]: I0319 20:24:10.369551 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rcxnv" podStartSLOduration=3.279990587 podStartE2EDuration="45.369531469s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="2026-03-19 20:23:27.679333705 +0000 UTC m=+1085.285286777" lastFinishedPulling="2026-03-19 20:24:09.768874587 +0000 UTC m=+1127.374827659" observedRunningTime="2026-03-19 20:24:10.367546725 +0000 UTC m=+1127.973499797" watchObservedRunningTime="2026-03-19 20:24:10.369531469 +0000 UTC m=+1127.975484541" Mar 19 20:24:10 crc kubenswrapper[4799]: I0319 20:24:10.515794 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:10 crc kubenswrapper[4799]: I0319 20:24:10.515879 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:10 crc kubenswrapper[4799]: I0319 20:24:10.561960 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:10 crc kubenswrapper[4799]: I0319 20:24:10.562640 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:11 crc kubenswrapper[4799]: I0319 20:24:11.361714 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:11 crc kubenswrapper[4799]: I0319 20:24:11.361775 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:11 crc kubenswrapper[4799]: I0319 20:24:11.386487 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d5dc7cf69-h4sjb" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Mar 19 20:24:13 crc kubenswrapper[4799]: I0319 20:24:13.216136 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:13 crc kubenswrapper[4799]: I0319 20:24:13.219897 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:13 crc kubenswrapper[4799]: I0319 20:24:13.384509 4799 generic.go:334] "Generic (PLEG): container finished" podID="51a34e40-cb35-4589-9fcd-20130bd7831f" containerID="b8f230c7dc9e803ce7e492826f25e0bb103a0b6ffcfc36c14c3ad57a4b6ee6b9" exitCode=0 Mar 19 20:24:13 crc kubenswrapper[4799]: I0319 20:24:13.384606 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-sync-jnnwh" event={"ID":"51a34e40-cb35-4589-9fcd-20130bd7831f","Type":"ContainerDied","Data":"b8f230c7dc9e803ce7e492826f25e0bb103a0b6ffcfc36c14c3ad57a4b6ee6b9"} Mar 19 20:24:13 crc kubenswrapper[4799]: I0319 20:24:13.389295 4799 generic.go:334] "Generic (PLEG): container finished" podID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" containerID="9eb2958c467d6f996313ad694f7fd17a3da3991d9bc2c30eeeb5e8bd600e1cb0" exitCode=0 Mar 19 20:24:13 crc kubenswrapper[4799]: I0319 20:24:13.389608 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rcxnv" event={"ID":"a40c3083-66e1-4fb7-b4b4-88fa2935cb49","Type":"ContainerDied","Data":"9eb2958c467d6f996313ad694f7fd17a3da3991d9bc2c30eeeb5e8bd600e1cb0"} Mar 19 20:24:14 crc kubenswrapper[4799]: I0319 20:24:14.987897 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8869c89f8-jvpbt" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 20:24:15 crc kubenswrapper[4799]: I0319 20:24:15.128500 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56454c8868-kxl79" podUID="d9b7bec9-2633-410d-be4e-c65c9a903a38" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.176443 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.181037 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jnnwh" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287176 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51a34e40-cb35-4589-9fcd-20130bd7831f-etc-machine-id\") pod \"51a34e40-cb35-4589-9fcd-20130bd7831f\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287222 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-combined-ca-bundle\") pod \"51a34e40-cb35-4589-9fcd-20130bd7831f\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287316 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-combined-ca-bundle\") pod \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287372 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktcs4\" (UniqueName: \"kubernetes.io/projected/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-kube-api-access-ktcs4\") pod \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287445 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-config-data\") pod \"51a34e40-cb35-4589-9fcd-20130bd7831f\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287519 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-q56wz\" (UniqueName: \"kubernetes.io/projected/51a34e40-cb35-4589-9fcd-20130bd7831f-kube-api-access-q56wz\") pod \"51a34e40-cb35-4589-9fcd-20130bd7831f\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287543 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-scripts\") pod \"51a34e40-cb35-4589-9fcd-20130bd7831f\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287576 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-db-sync-config-data\") pod \"51a34e40-cb35-4589-9fcd-20130bd7831f\" (UID: \"51a34e40-cb35-4589-9fcd-20130bd7831f\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287597 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-db-sync-config-data\") pod \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\" (UID: \"a40c3083-66e1-4fb7-b4b4-88fa2935cb49\") " Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.287535 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51a34e40-cb35-4589-9fcd-20130bd7831f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51a34e40-cb35-4589-9fcd-20130bd7831f" (UID: "51a34e40-cb35-4589-9fcd-20130bd7831f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.298561 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a40c3083-66e1-4fb7-b4b4-88fa2935cb49" (UID: "a40c3083-66e1-4fb7-b4b4-88fa2935cb49"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.300823 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-kube-api-access-ktcs4" (OuterVolumeSpecName: "kube-api-access-ktcs4") pod "a40c3083-66e1-4fb7-b4b4-88fa2935cb49" (UID: "a40c3083-66e1-4fb7-b4b4-88fa2935cb49"). InnerVolumeSpecName "kube-api-access-ktcs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.311951 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-scripts" (OuterVolumeSpecName: "scripts") pod "51a34e40-cb35-4589-9fcd-20130bd7831f" (UID: "51a34e40-cb35-4589-9fcd-20130bd7831f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.312553 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a34e40-cb35-4589-9fcd-20130bd7831f-kube-api-access-q56wz" (OuterVolumeSpecName: "kube-api-access-q56wz") pod "51a34e40-cb35-4589-9fcd-20130bd7831f" (UID: "51a34e40-cb35-4589-9fcd-20130bd7831f"). InnerVolumeSpecName "kube-api-access-q56wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.329644 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "51a34e40-cb35-4589-9fcd-20130bd7831f" (UID: "51a34e40-cb35-4589-9fcd-20130bd7831f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.333312 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a40c3083-66e1-4fb7-b4b4-88fa2935cb49" (UID: "a40c3083-66e1-4fb7-b4b4-88fa2935cb49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.336455 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51a34e40-cb35-4589-9fcd-20130bd7831f" (UID: "51a34e40-cb35-4589-9fcd-20130bd7831f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.368290 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-config-data" (OuterVolumeSpecName: "config-data") pod "51a34e40-cb35-4589-9fcd-20130bd7831f" (UID: "51a34e40-cb35-4589-9fcd-20130bd7831f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389225 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktcs4\" (UniqueName: \"kubernetes.io/projected/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-kube-api-access-ktcs4\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389250 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389259 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q56wz\" (UniqueName: \"kubernetes.io/projected/51a34e40-cb35-4589-9fcd-20130bd7831f-kube-api-access-q56wz\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389269 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389278 4799 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389286 4799 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389295 4799 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51a34e40-cb35-4589-9fcd-20130bd7831f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 
20:24:16.389305 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a34e40-cb35-4589-9fcd-20130bd7831f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.389313 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40c3083-66e1-4fb7-b4b4-88fa2935cb49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.419492 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerStarted","Data":"0b0bc739ced47c2d48abcf84a82c71ca91fc8302d8ee7cba9cd74e96fd0a0aa3"} Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.419580 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-central-agent" containerID="cri-o://9d014abe28c600f1b9544ac596c5d4d8e3713e7df4e2e0f510a846ecd8b92231" gracePeriod=30 Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.419629 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="sg-core" containerID="cri-o://49b4c92fe13b31dc0792a94c48277182a6fbd8562ee07c6a68a8a8c9da9e9227" gracePeriod=30 Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.419664 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-notification-agent" containerID="cri-o://d6d021779bbd46cc8379d28b90aee35c6c7cf1ef2db3cf0d06c8f6533ed7fdc2" gracePeriod=30 Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.419662 4799 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="proxy-httpd" containerID="cri-o://0b0bc739ced47c2d48abcf84a82c71ca91fc8302d8ee7cba9cd74e96fd0a0aa3" gracePeriod=30 Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.419711 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.424617 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rcxnv" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.424625 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rcxnv" event={"ID":"a40c3083-66e1-4fb7-b4b4-88fa2935cb49","Type":"ContainerDied","Data":"c5f98c6a80037c286f553835f815e10a625b7e3257b03c9cf63919da36c49791"} Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.425321 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f98c6a80037c286f553835f815e10a625b7e3257b03c9cf63919da36c49791" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.426650 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jnnwh" event={"ID":"51a34e40-cb35-4589-9fcd-20130bd7831f","Type":"ContainerDied","Data":"4fcb078cc059a6cb1ef0587c40c790ba66d4a6b60db195e4f46279f722c8da27"} Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.426684 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fcb078cc059a6cb1ef0587c40c790ba66d4a6b60db195e4f46279f722c8da27" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.426707 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jnnwh" Mar 19 20:24:16 crc kubenswrapper[4799]: I0319 20:24:16.443477 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.27073637 podStartE2EDuration="51.443440867s" podCreationTimestamp="2026-03-19 20:23:25 +0000 UTC" firstStartedPulling="2026-03-19 20:23:26.854935764 +0000 UTC m=+1084.460888836" lastFinishedPulling="2026-03-19 20:24:16.027640241 +0000 UTC m=+1133.633593333" observedRunningTime="2026-03-19 20:24:16.440077005 +0000 UTC m=+1134.046030077" watchObservedRunningTime="2026-03-19 20:24:16.443440867 +0000 UTC m=+1134.049393939" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.445704 4799 generic.go:334] "Generic (PLEG): container finished" podID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerID="0b0bc739ced47c2d48abcf84a82c71ca91fc8302d8ee7cba9cd74e96fd0a0aa3" exitCode=0 Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.445731 4799 generic.go:334] "Generic (PLEG): container finished" podID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerID="49b4c92fe13b31dc0792a94c48277182a6fbd8562ee07c6a68a8a8c9da9e9227" exitCode=2 Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.445739 4799 generic.go:334] "Generic (PLEG): container finished" podID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerID="9d014abe28c600f1b9544ac596c5d4d8e3713e7df4e2e0f510a846ecd8b92231" exitCode=0 Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.445757 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerDied","Data":"0b0bc739ced47c2d48abcf84a82c71ca91fc8302d8ee7cba9cd74e96fd0a0aa3"} Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.445785 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerDied","Data":"49b4c92fe13b31dc0792a94c48277182a6fbd8562ee07c6a68a8a8c9da9e9227"} Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.445796 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerDied","Data":"9d014abe28c600f1b9544ac596c5d4d8e3713e7df4e2e0f510a846ecd8b92231"} Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.616631 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-784d54cf6f-wm6ts"] Mar 19 20:24:17 crc kubenswrapper[4799]: E0319 20:24:17.617134 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f8c9fc-4f87-4da0-aa76-908eb7e729d6" containerName="oc" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617158 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f8c9fc-4f87-4da0-aa76-908eb7e729d6" containerName="oc" Mar 19 20:24:17 crc kubenswrapper[4799]: E0319 20:24:17.617172 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a34e40-cb35-4589-9fcd-20130bd7831f" containerName="cinder-db-sync" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617180 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a34e40-cb35-4589-9fcd-20130bd7831f" containerName="cinder-db-sync" Mar 19 20:24:17 crc kubenswrapper[4799]: E0319 20:24:17.617202 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="dnsmasq-dns" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617209 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="dnsmasq-dns" Mar 19 20:24:17 crc kubenswrapper[4799]: E0319 20:24:17.617220 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" containerName="barbican-db-sync" Mar 19 20:24:17 crc 
kubenswrapper[4799]: I0319 20:24:17.617227 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" containerName="barbican-db-sync" Mar 19 20:24:17 crc kubenswrapper[4799]: E0319 20:24:17.617276 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="init" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617285 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="init" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617514 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f8c9fc-4f87-4da0-aa76-908eb7e729d6" containerName="oc" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617527 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" containerName="barbican-db-sync" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617571 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a34e40-cb35-4589-9fcd-20130bd7831f" containerName="cinder-db-sync" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.617583 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d181492c-247b-49bc-8e40-7a531d76011d" containerName="dnsmasq-dns" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.618485 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.631442 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.633064 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.639955 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.640209 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.640473 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.640567 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.640792 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.645335 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k2v25" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.645513 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rtfrr" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.681451 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-776dcdd75d-ljvsd"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.683270 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.687670 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.699596 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-784d54cf6f-wm6ts"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.717915 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp6fs\" (UniqueName: \"kubernetes.io/projected/c6596c03-a397-4f22-b511-86e89635a92a-kube-api-access-gp6fs\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.717962 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.717985 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718004 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-config-data\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " 
pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718020 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-scripts\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718042 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-config-data-custom\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718060 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-config-data-custom\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718076 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-config-data\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718103 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4r2z\" (UniqueName: 
\"kubernetes.io/projected/4b710194-7925-42ab-b779-be3f32094307-kube-api-access-s4r2z\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718142 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718179 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b710194-7925-42ab-b779-be3f32094307-logs\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718197 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4tgb\" (UniqueName: \"kubernetes.io/projected/19271643-f120-41c0-aec0-69316be7c3c2-kube-api-access-t4tgb\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718215 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-combined-ca-bundle\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718250 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-combined-ca-bundle\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718275 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19271643-f120-41c0-aec0-69316be7c3c2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.718291 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6596c03-a397-4f22-b511-86e89635a92a-logs\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.728985 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.754700 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-776dcdd75d-ljvsd"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833188 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833365 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4b710194-7925-42ab-b779-be3f32094307-logs\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833456 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4tgb\" (UniqueName: \"kubernetes.io/projected/19271643-f120-41c0-aec0-69316be7c3c2-kube-api-access-t4tgb\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833486 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-combined-ca-bundle\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833573 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-combined-ca-bundle\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833629 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19271643-f120-41c0-aec0-69316be7c3c2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833652 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c6596c03-a397-4f22-b511-86e89635a92a-logs\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833699 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp6fs\" (UniqueName: \"kubernetes.io/projected/c6596c03-a397-4f22-b511-86e89635a92a-kube-api-access-gp6fs\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833747 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833782 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833806 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-config-data\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833826 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833860 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-config-data-custom\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833884 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-config-data-custom\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833905 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-config-data\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.833952 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r2z\" (UniqueName: \"kubernetes.io/projected/4b710194-7925-42ab-b779-be3f32094307-kube-api-access-s4r2z\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.834678 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4b710194-7925-42ab-b779-be3f32094307-logs\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.840914 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6596c03-a397-4f22-b511-86e89635a92a-logs\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.843512 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19271643-f120-41c0-aec0-69316be7c3c2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.864558 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.868524 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-config-data-custom\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.868971 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-combined-ca-bundle\") 
pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.869243 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.869321 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b710194-7925-42ab-b779-be3f32094307-config-data\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.870029 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-scripts\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.870293 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.870391 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b68d5b88c-8bgcn"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.870803 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-config-data-custom\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.871921 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.874394 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-config-data\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.881825 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6596c03-a397-4f22-b511-86e89635a92a-combined-ca-bundle\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.892987 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r2z\" (UniqueName: \"kubernetes.io/projected/4b710194-7925-42ab-b779-be3f32094307-kube-api-access-s4r2z\") pod \"barbican-keystone-listener-776dcdd75d-ljvsd\" (UID: \"4b710194-7925-42ab-b779-be3f32094307\") " pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.895003 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b68d5b88c-8bgcn"] Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.929717 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4tgb\" (UniqueName: 
\"kubernetes.io/projected/19271643-f120-41c0-aec0-69316be7c3c2-kube-api-access-t4tgb\") pod \"cinder-scheduler-0\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940364 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940422 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmzp\" (UniqueName: \"kubernetes.io/projected/9c750049-2704-43ac-90c1-6397d59e80d8-kube-api-access-ldmzp\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940482 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940531 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-swift-storage-0\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940576 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-config\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940602 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-svc\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.940995 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp6fs\" (UniqueName: \"kubernetes.io/projected/c6596c03-a397-4f22-b511-86e89635a92a-kube-api-access-gp6fs\") pod \"barbican-worker-784d54cf6f-wm6ts\" (UID: \"c6596c03-a397-4f22-b511-86e89635a92a\") " pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.942962 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-784d54cf6f-wm6ts" Mar 19 20:24:17 crc kubenswrapper[4799]: I0319 20:24:17.962194 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.003712 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b68d5b88c-8bgcn"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.005769 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.043276 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.043361 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-swift-storage-0\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.043429 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-config\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.043454 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-svc\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.043493 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " 
pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.043513 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmzp\" (UniqueName: \"kubernetes.io/projected/9c750049-2704-43ac-90c1-6397d59e80d8-kube-api-access-ldmzp\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.044654 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.045320 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-swift-storage-0\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.045881 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.046285 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-svc\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc 
kubenswrapper[4799]: I0319 20:24:18.054478 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-config\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.064015 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.065941 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.071481 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.079914 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cbfdb8596-sd77z"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.081419 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.085468 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.093984 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94c999df7-9gfg8"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.095438 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.100689 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.107286 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-9gfg8"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.109160 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmzp\" (UniqueName: \"kubernetes.io/projected/9c750049-2704-43ac-90c1-6397d59e80d8-kube-api-access-ldmzp\") pod \"dnsmasq-dns-7b68d5b88c-8bgcn\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.116217 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbfdb8596-sd77z"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.144502 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147503 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147583 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/68702358-3d14-4012-9f8d-cecd4517ced7-kube-api-access-42bnj\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147621 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-nb\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147658 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc302d7-a52f-43bd-9d88-06fa20300a1e-logs\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147683 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147720 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147752 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whxld\" (UniqueName: \"kubernetes.io/projected/88950a1d-04b9-47f5-b45e-a30afb906eaa-kube-api-access-whxld\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147774 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-config\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147793 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-combined-ca-bundle\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147850 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-sb\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: 
\"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147869 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88950a1d-04b9-47f5-b45e-a30afb906eaa-logs\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.147941 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data-custom\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.148518 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-swift-storage-0\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.148555 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-scripts\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.148584 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgzl\" (UniqueName: \"kubernetes.io/projected/2fc302d7-a52f-43bd-9d88-06fa20300a1e-kube-api-access-vxgzl\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: 
\"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.148610 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88950a1d-04b9-47f5-b45e-a30afb906eaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.148644 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.148706 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-svc\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.259635 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-sb\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.259995 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88950a1d-04b9-47f5-b45e-a30afb906eaa-logs\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 
20:24:18.260017 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data-custom\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260051 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-swift-storage-0\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260068 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-scripts\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260091 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgzl\" (UniqueName: \"kubernetes.io/projected/2fc302d7-a52f-43bd-9d88-06fa20300a1e-kube-api-access-vxgzl\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260114 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88950a1d-04b9-47f5-b45e-a30afb906eaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260141 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260158 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-svc\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260173 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260191 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/68702358-3d14-4012-9f8d-cecd4517ced7-kube-api-access-42bnj\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260211 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-nb\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260247 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc302d7-a52f-43bd-9d88-06fa20300a1e-logs\") pod 
\"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260267 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260298 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260340 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whxld\" (UniqueName: \"kubernetes.io/projected/88950a1d-04b9-47f5-b45e-a30afb906eaa-kube-api-access-whxld\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260364 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-config\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.260394 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-combined-ca-bundle\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 
20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.263302 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc302d7-a52f-43bd-9d88-06fa20300a1e-logs\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.264011 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-svc\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.264615 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-sb\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.264894 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88950a1d-04b9-47f5-b45e-a30afb906eaa-logs\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.264915 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-nb\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.265632 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-combined-ca-bundle\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.266670 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-swift-storage-0\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.267677 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88950a1d-04b9-47f5-b45e-a30afb906eaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.269726 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-config\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.271556 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.273093 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " 
pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.273211 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.275787 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-scripts\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.279524 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.281395 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/68702358-3d14-4012-9f8d-cecd4517ced7-kube-api-access-42bnj\") pod \"dnsmasq-dns-94c999df7-9gfg8\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.284278 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data-custom\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.285347 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-whxld\" (UniqueName: \"kubernetes.io/projected/88950a1d-04b9-47f5-b45e-a30afb906eaa-kube-api-access-whxld\") pod \"cinder-api-0\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.288576 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgzl\" (UniqueName: \"kubernetes.io/projected/2fc302d7-a52f-43bd-9d88-06fa20300a1e-kube-api-access-vxgzl\") pod \"barbican-api-7cbfdb8596-sd77z\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.482241 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.520581 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.529607 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.660795 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-784d54cf6f-wm6ts"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.685490 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b68d5b88c-8bgcn"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.697262 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-776dcdd75d-ljvsd"] Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.711462 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:18 crc kubenswrapper[4799]: W0319 20:24:18.723830 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6596c03_a397_4f22_b511_86e89635a92a.slice/crio-57b75542e07dc13caca94991d1017c8d0796ef857df41ebb177de1d045903fe8 WatchSource:0}: Error finding container 57b75542e07dc13caca94991d1017c8d0796ef857df41ebb177de1d045903fe8: Status 404 returned error can't find the container with id 57b75542e07dc13caca94991d1017c8d0796ef857df41ebb177de1d045903fe8 Mar 19 20:24:18 crc kubenswrapper[4799]: I0319 20:24:18.846754 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:18 crc kubenswrapper[4799]: W0319 20:24:18.856593 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88950a1d_04b9_47f5_b45e_a30afb906eaa.slice/crio-a70adf210d3e1b385b9d45312a05b3aee3670c2578452fa40c9f4fa2ae9e26e6 WatchSource:0}: Error finding container a70adf210d3e1b385b9d45312a05b3aee3670c2578452fa40c9f4fa2ae9e26e6: Status 404 returned error can't find the container with id a70adf210d3e1b385b9d45312a05b3aee3670c2578452fa40c9f4fa2ae9e26e6 Mar 19 20:24:19 crc 
kubenswrapper[4799]: I0319 20:24:19.090168 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbfdb8596-sd77z"] Mar 19 20:24:19 crc kubenswrapper[4799]: W0319 20:24:19.106605 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fc302d7_a52f_43bd_9d88_06fa20300a1e.slice/crio-6db2279c43c56f35e2425be4145a34964ee074316edfb912040f9cb9ab50eaa0 WatchSource:0}: Error finding container 6db2279c43c56f35e2425be4145a34964ee074316edfb912040f9cb9ab50eaa0: Status 404 returned error can't find the container with id 6db2279c43c56f35e2425be4145a34964ee074316edfb912040f9cb9ab50eaa0 Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.241505 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-9gfg8"] Mar 19 20:24:19 crc kubenswrapper[4799]: W0319 20:24:19.245110 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68702358_3d14_4012_9f8d_cecd4517ced7.slice/crio-701474f48885379aa28bf4a970eec2c1f339224e976ab6051e9d619863f3c05d WatchSource:0}: Error finding container 701474f48885379aa28bf4a970eec2c1f339224e976ab6051e9d619863f3c05d: Status 404 returned error can't find the container with id 701474f48885379aa28bf4a970eec2c1f339224e976ab6051e9d619863f3c05d Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.475282 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" event={"ID":"4b710194-7925-42ab-b779-be3f32094307","Type":"ContainerStarted","Data":"ac89cad7ce920fcd6ac4d32498969a8508b6a6f7e1a4bcab3cd2af71c675c12e"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.481670 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbfdb8596-sd77z" 
event={"ID":"2fc302d7-a52f-43bd-9d88-06fa20300a1e","Type":"ContainerStarted","Data":"c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.481735 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbfdb8596-sd77z" event={"ID":"2fc302d7-a52f-43bd-9d88-06fa20300a1e","Type":"ContainerStarted","Data":"6db2279c43c56f35e2425be4145a34964ee074316edfb912040f9cb9ab50eaa0"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.493718 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" event={"ID":"68702358-3d14-4012-9f8d-cecd4517ced7","Type":"ContainerStarted","Data":"691d8ab6cf09593e55fec25314b3e29bc313d928f046f80a539491400ca9802e"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.493835 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" event={"ID":"68702358-3d14-4012-9f8d-cecd4517ced7","Type":"ContainerStarted","Data":"701474f48885379aa28bf4a970eec2c1f339224e976ab6051e9d619863f3c05d"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.503785 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88950a1d-04b9-47f5-b45e-a30afb906eaa","Type":"ContainerStarted","Data":"a70adf210d3e1b385b9d45312a05b3aee3670c2578452fa40c9f4fa2ae9e26e6"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.509150 4799 generic.go:334] "Generic (PLEG): container finished" podID="9c750049-2704-43ac-90c1-6397d59e80d8" containerID="f236ba283e34b043021652935d3c1bcaae345475d204ce31431fc641d59f8439" exitCode=0 Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.509287 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" event={"ID":"9c750049-2704-43ac-90c1-6397d59e80d8","Type":"ContainerDied","Data":"f236ba283e34b043021652935d3c1bcaae345475d204ce31431fc641d59f8439"} Mar 19 20:24:19 crc 
kubenswrapper[4799]: I0319 20:24:19.509334 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" event={"ID":"9c750049-2704-43ac-90c1-6397d59e80d8","Type":"ContainerStarted","Data":"073a42ab52e74aae84412a2c469332e118750bd45e62a9a89a5102dd1b1648ff"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.523554 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19271643-f120-41c0-aec0-69316be7c3c2","Type":"ContainerStarted","Data":"bdcffdc18367be84fd6e845b099606d5bae26e497ce9643471c767807f155c58"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.525746 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-784d54cf6f-wm6ts" event={"ID":"c6596c03-a397-4f22-b511-86e89635a92a","Type":"ContainerStarted","Data":"57b75542e07dc13caca94991d1017c8d0796ef857df41ebb177de1d045903fe8"} Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.808413 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.904984 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-config\") pod \"9c750049-2704-43ac-90c1-6397d59e80d8\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.905048 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-svc\") pod \"9c750049-2704-43ac-90c1-6397d59e80d8\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.905075 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-swift-storage-0\") pod \"9c750049-2704-43ac-90c1-6397d59e80d8\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.905183 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-nb\") pod \"9c750049-2704-43ac-90c1-6397d59e80d8\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.905209 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-sb\") pod \"9c750049-2704-43ac-90c1-6397d59e80d8\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.905289 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldmzp\" 
(UniqueName: \"kubernetes.io/projected/9c750049-2704-43ac-90c1-6397d59e80d8-kube-api-access-ldmzp\") pod \"9c750049-2704-43ac-90c1-6397d59e80d8\" (UID: \"9c750049-2704-43ac-90c1-6397d59e80d8\") " Mar 19 20:24:19 crc kubenswrapper[4799]: I0319 20:24:19.921588 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c750049-2704-43ac-90c1-6397d59e80d8-kube-api-access-ldmzp" (OuterVolumeSpecName: "kube-api-access-ldmzp") pod "9c750049-2704-43ac-90c1-6397d59e80d8" (UID: "9c750049-2704-43ac-90c1-6397d59e80d8"). InnerVolumeSpecName "kube-api-access-ldmzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.007337 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldmzp\" (UniqueName: \"kubernetes.io/projected/9c750049-2704-43ac-90c1-6397d59e80d8-kube-api-access-ldmzp\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.012780 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c750049-2704-43ac-90c1-6397d59e80d8" (UID: "9c750049-2704-43ac-90c1-6397d59e80d8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.019938 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c750049-2704-43ac-90c1-6397d59e80d8" (UID: "9c750049-2704-43ac-90c1-6397d59e80d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.020416 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-config" (OuterVolumeSpecName: "config") pod "9c750049-2704-43ac-90c1-6397d59e80d8" (UID: "9c750049-2704-43ac-90c1-6397d59e80d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.038988 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c750049-2704-43ac-90c1-6397d59e80d8" (UID: "9c750049-2704-43ac-90c1-6397d59e80d8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.091333 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c750049-2704-43ac-90c1-6397d59e80d8" (UID: "9c750049-2704-43ac-90c1-6397d59e80d8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.126479 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.126513 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.126522 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.126531 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.126541 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c750049-2704-43ac-90c1-6397d59e80d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.236757 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.539411 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88950a1d-04b9-47f5-b45e-a30afb906eaa","Type":"ContainerStarted","Data":"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae"} Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.541560 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" 
event={"ID":"9c750049-2704-43ac-90c1-6397d59e80d8","Type":"ContainerDied","Data":"073a42ab52e74aae84412a2c469332e118750bd45e62a9a89a5102dd1b1648ff"} Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.541612 4799 scope.go:117] "RemoveContainer" containerID="f236ba283e34b043021652935d3c1bcaae345475d204ce31431fc641d59f8439" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.541725 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b68d5b88c-8bgcn" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.559595 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbfdb8596-sd77z" event={"ID":"2fc302d7-a52f-43bd-9d88-06fa20300a1e","Type":"ContainerStarted","Data":"331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca"} Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.559920 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.559942 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.562547 4799 generic.go:334] "Generic (PLEG): container finished" podID="68702358-3d14-4012-9f8d-cecd4517ced7" containerID="691d8ab6cf09593e55fec25314b3e29bc313d928f046f80a539491400ca9802e" exitCode=0 Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.562577 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" event={"ID":"68702358-3d14-4012-9f8d-cecd4517ced7","Type":"ContainerDied","Data":"691d8ab6cf09593e55fec25314b3e29bc313d928f046f80a539491400ca9802e"} Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.562594 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" 
event={"ID":"68702358-3d14-4012-9f8d-cecd4517ced7","Type":"ContainerStarted","Data":"a44643ffc07120d1e32ea8fd0ade77e4f2ce18b92b138b99a379874b44443eca"} Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.562974 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.575709 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cbfdb8596-sd77z" podStartSLOduration=2.575691995 podStartE2EDuration="2.575691995s" podCreationTimestamp="2026-03-19 20:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:20.574884773 +0000 UTC m=+1138.180837845" watchObservedRunningTime="2026-03-19 20:24:20.575691995 +0000 UTC m=+1138.181645067" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.598162 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" podStartSLOduration=2.598140909 podStartE2EDuration="2.598140909s" podCreationTimestamp="2026-03-19 20:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:20.594320755 +0000 UTC m=+1138.200273827" watchObservedRunningTime="2026-03-19 20:24:20.598140909 +0000 UTC m=+1138.204093981" Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.652034 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b68d5b88c-8bgcn"] Mar 19 20:24:20 crc kubenswrapper[4799]: I0319 20:24:20.666678 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b68d5b88c-8bgcn"] Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.127608 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c750049-2704-43ac-90c1-6397d59e80d8" 
path="/var/lib/kubelet/pods/9c750049-2704-43ac-90c1-6397d59e80d8/volumes" Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.724672 4799 generic.go:334] "Generic (PLEG): container finished" podID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerID="d6d021779bbd46cc8379d28b90aee35c6c7cf1ef2db3cf0d06c8f6533ed7fdc2" exitCode=0 Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.724830 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerDied","Data":"d6d021779bbd46cc8379d28b90aee35c6c7cf1ef2db3cf0d06c8f6533ed7fdc2"} Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.726983 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88950a1d-04b9-47f5-b45e-a30afb906eaa","Type":"ContainerStarted","Data":"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2"} Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.727120 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api-log" containerID="cri-o://d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae" gracePeriod=30 Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.727267 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api" containerID="cri-o://337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2" gracePeriod=30 Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.727463 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.737525 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"19271643-f120-41c0-aec0-69316be7c3c2","Type":"ContainerStarted","Data":"ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189"} Mar 19 20:24:21 crc kubenswrapper[4799]: I0319 20:24:21.768231 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.7682135690000003 podStartE2EDuration="3.768213569s" podCreationTimestamp="2026-03-19 20:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:21.747661606 +0000 UTC m=+1139.353614688" watchObservedRunningTime="2026-03-19 20:24:21.768213569 +0000 UTC m=+1139.374166641" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.259028 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.404961 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.422862 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-log-httpd\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.423128 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-combined-ca-bundle\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.423172 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-config-data\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.423310 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qklsf\" (UniqueName: \"kubernetes.io/projected/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-kube-api-access-qklsf\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.423341 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-run-httpd\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.423405 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-scripts\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.423498 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-sg-core-conf-yaml\") pod \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\" (UID: \"6e306dff-f3ce-48df-b9d4-aef952f3e0a5\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.424294 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.425079 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.428015 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-scripts" (OuterVolumeSpecName: "scripts") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.429053 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-kube-api-access-qklsf" (OuterVolumeSpecName: "kube-api-access-qklsf") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "kube-api-access-qklsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.471322 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.524806 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88950a1d-04b9-47f5-b45e-a30afb906eaa-logs\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.524896 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data-custom\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.524952 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88950a1d-04b9-47f5-b45e-a30afb906eaa-etc-machine-id\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: 
\"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.525003 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whxld\" (UniqueName: \"kubernetes.io/projected/88950a1d-04b9-47f5-b45e-a30afb906eaa-kube-api-access-whxld\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.525056 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88950a1d-04b9-47f5-b45e-a30afb906eaa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.525345 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88950a1d-04b9-47f5-b45e-a30afb906eaa-logs" (OuterVolumeSpecName: "logs") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.525544 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-combined-ca-bundle\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.525661 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-scripts\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.525751 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data\") pod \"88950a1d-04b9-47f5-b45e-a30afb906eaa\" (UID: \"88950a1d-04b9-47f5-b45e-a30afb906eaa\") " Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526260 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526284 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526298 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88950a1d-04b9-47f5-b45e-a30afb906eaa-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526310 4799 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qklsf\" (UniqueName: \"kubernetes.io/projected/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-kube-api-access-qklsf\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526325 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526338 4799 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88950a1d-04b9-47f5-b45e-a30afb906eaa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.526350 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.528364 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88950a1d-04b9-47f5-b45e-a30afb906eaa-kube-api-access-whxld" (OuterVolumeSpecName: "kube-api-access-whxld") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "kube-api-access-whxld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.531016 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-scripts" (OuterVolumeSpecName: "scripts") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.531829 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.627455 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.627485 4799 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.627496 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whxld\" (UniqueName: \"kubernetes.io/projected/88950a1d-04b9-47f5-b45e-a30afb906eaa-kube-api-access-whxld\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.641788 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.642207 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.655511 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-config-data" (OuterVolumeSpecName: "config-data") pod "6e306dff-f3ce-48df-b9d4-aef952f3e0a5" (UID: "6e306dff-f3ce-48df-b9d4-aef952f3e0a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.672106 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data" (OuterVolumeSpecName: "config-data") pod "88950a1d-04b9-47f5-b45e-a30afb906eaa" (UID: "88950a1d-04b9-47f5-b45e-a30afb906eaa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.729353 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.729457 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.729469 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e306dff-f3ce-48df-b9d4-aef952f3e0a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.729479 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88950a1d-04b9-47f5-b45e-a30afb906eaa-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746205 4799 generic.go:334] "Generic (PLEG): container finished" podID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerID="337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2" exitCode=0 Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746237 4799 generic.go:334] "Generic (PLEG): container finished" podID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerID="d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae" exitCode=143 Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746274 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746286 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88950a1d-04b9-47f5-b45e-a30afb906eaa","Type":"ContainerDied","Data":"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746571 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88950a1d-04b9-47f5-b45e-a30afb906eaa","Type":"ContainerDied","Data":"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746604 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88950a1d-04b9-47f5-b45e-a30afb906eaa","Type":"ContainerDied","Data":"a70adf210d3e1b385b9d45312a05b3aee3670c2578452fa40c9f4fa2ae9e26e6"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.746622 4799 scope.go:117] "RemoveContainer" containerID="337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.749536 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19271643-f120-41c0-aec0-69316be7c3c2","Type":"ContainerStarted","Data":"faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.751525 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-784d54cf6f-wm6ts" event={"ID":"c6596c03-a397-4f22-b511-86e89635a92a","Type":"ContainerStarted","Data":"6ccc262da12678da277df18a3d0161a8b8025208b5b43659d7de860d74706f78"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.751565 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-784d54cf6f-wm6ts" 
event={"ID":"c6596c03-a397-4f22-b511-86e89635a92a","Type":"ContainerStarted","Data":"de131699c0c1035fbed0ef471a0c9f9f462af84bb57d20c6fb5a12ebc9fd3f4c"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.753024 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" event={"ID":"4b710194-7925-42ab-b779-be3f32094307","Type":"ContainerStarted","Data":"a4a8501b5839fbcbb0296aac1aa5b6676044642ad8667034f87d50c2a8aad74e"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.753061 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" event={"ID":"4b710194-7925-42ab-b779-be3f32094307","Type":"ContainerStarted","Data":"a03242b2953514e1b901d94a8b27830b61facd6c2200a1dfb2c9663297449de4"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.763435 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e306dff-f3ce-48df-b9d4-aef952f3e0a5","Type":"ContainerDied","Data":"233d9264946f1281ac764a6bb708a88426e707a71c39f707e965f381c77eee1b"} Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.763584 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.776516 4799 scope.go:117] "RemoveContainer" containerID="d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.780612 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.803048402 podStartE2EDuration="5.780589625s" podCreationTimestamp="2026-03-19 20:24:17 +0000 UTC" firstStartedPulling="2026-03-19 20:24:18.700759861 +0000 UTC m=+1136.306712943" lastFinishedPulling="2026-03-19 20:24:19.678301104 +0000 UTC m=+1137.284254166" observedRunningTime="2026-03-19 20:24:22.772747981 +0000 UTC m=+1140.378701053" watchObservedRunningTime="2026-03-19 20:24:22.780589625 +0000 UTC m=+1140.386542707" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.818746 4799 scope.go:117] "RemoveContainer" containerID="337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.838736 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2\": container with ID starting with 337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2 not found: ID does not exist" containerID="337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.838778 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2"} err="failed to get container status \"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2\": rpc error: code = NotFound desc = could not find container \"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2\": container with ID starting with 
337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2 not found: ID does not exist" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.838801 4799 scope.go:117] "RemoveContainer" containerID="d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.843769 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae\": container with ID starting with d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae not found: ID does not exist" containerID="d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.843814 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae"} err="failed to get container status \"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae\": rpc error: code = NotFound desc = could not find container \"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae\": container with ID starting with d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae not found: ID does not exist" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.843840 4799 scope.go:117] "RemoveContainer" containerID="337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.844083 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2"} err="failed to get container status \"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2\": rpc error: code = NotFound desc = could not find container \"337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2\": container with ID 
starting with 337a72d54294416bb868349b624d7fc27e260f17b0334a7607ee134eee51b2d2 not found: ID does not exist" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.844102 4799 scope.go:117] "RemoveContainer" containerID="d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.844286 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae"} err="failed to get container status \"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae\": rpc error: code = NotFound desc = could not find container \"d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae\": container with ID starting with d1a5eeca935719c00109246be8bc477a219b6212839706c20350a196515087ae not found: ID does not exist" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.844304 4799 scope.go:117] "RemoveContainer" containerID="0b0bc739ced47c2d48abcf84a82c71ca91fc8302d8ee7cba9cd74e96fd0a0aa3" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.845664 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-784d54cf6f-wm6ts" podStartSLOduration=2.7461132790000002 podStartE2EDuration="5.845641395s" podCreationTimestamp="2026-03-19 20:24:17 +0000 UTC" firstStartedPulling="2026-03-19 20:24:18.758746707 +0000 UTC m=+1136.364699779" lastFinishedPulling="2026-03-19 20:24:21.858274813 +0000 UTC m=+1139.464227895" observedRunningTime="2026-03-19 20:24:22.810593966 +0000 UTC m=+1140.416547038" watchObservedRunningTime="2026-03-19 20:24:22.845641395 +0000 UTC m=+1140.451594467" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.866810 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-776dcdd75d-ljvsd" podStartSLOduration=2.796196419 podStartE2EDuration="5.866787873s" podCreationTimestamp="2026-03-19 
20:24:17 +0000 UTC" firstStartedPulling="2026-03-19 20:24:18.762110199 +0000 UTC m=+1136.368063271" lastFinishedPulling="2026-03-19 20:24:21.832701653 +0000 UTC m=+1139.438654725" observedRunningTime="2026-03-19 20:24:22.839157417 +0000 UTC m=+1140.445110489" watchObservedRunningTime="2026-03-19 20:24:22.866787873 +0000 UTC m=+1140.472740945" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.880585 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.902568 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.919463 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.919552 4799 scope.go:117] "RemoveContainer" containerID="49b4c92fe13b31dc0792a94c48277182a6fbd8562ee07c6a68a8a8c9da9e9227" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.957585 4799 scope.go:117] "RemoveContainer" containerID="d6d021779bbd46cc8379d28b90aee35c6c7cf1ef2db3cf0d06c8f6533ed7fdc2" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.963272 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.975446 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.981483 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.981889 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="proxy-httpd" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.981905 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" 
containerName="proxy-httpd" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.981918 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.981925 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.981943 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-notification-agent" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.981949 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-notification-agent" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.981959 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api-log" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.981964 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api-log" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.981987 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-central-agent" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.981993 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-central-agent" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.982002 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="sg-core" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982008 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="sg-core" Mar 19 20:24:22 crc kubenswrapper[4799]: E0319 20:24:22.982024 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c750049-2704-43ac-90c1-6397d59e80d8" containerName="init" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982029 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c750049-2704-43ac-90c1-6397d59e80d8" containerName="init" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982193 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-central-agent" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982207 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982218 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="proxy-httpd" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982226 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="sg-core" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982236 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" containerName="cinder-api-log" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982243 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" containerName="ceilometer-notification-agent" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.982257 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c750049-2704-43ac-90c1-6397d59e80d8" containerName="init" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.983231 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.987189 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.987598 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.993661 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:22 crc kubenswrapper[4799]: I0319 20:24:22.995547 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:22.999083 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.001227 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.001513 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.001639 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.013727 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044519 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01d335a7-09e5-4073-bf70-5ac03807ff12-etc-machine-id\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044572 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjbq\" (UniqueName: \"kubernetes.io/projected/01d335a7-09e5-4073-bf70-5ac03807ff12-kube-api-access-psjbq\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044597 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044622 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d335a7-09e5-4073-bf70-5ac03807ff12-logs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044648 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-scripts\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044668 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-scripts\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044686 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044704 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044718 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044747 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-config-data-custom\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044764 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5s9b\" (UniqueName: \"kubernetes.io/projected/6d3a79a1-4358-46e0-b444-151927715b1a-kube-api-access-f5s9b\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044798 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044816 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-config-data\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044839 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-config-data\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044862 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-run-httpd\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.044895 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-log-httpd\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.068564 4799 scope.go:117] "RemoveContainer" containerID="9d014abe28c600f1b9544ac596c5d4d8e3713e7df4e2e0f510a846ecd8b92231" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.147793 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01d335a7-09e5-4073-bf70-5ac03807ff12-etc-machine-id\") 
pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.147883 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjbq\" (UniqueName: \"kubernetes.io/projected/01d335a7-09e5-4073-bf70-5ac03807ff12-kube-api-access-psjbq\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.147926 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.147930 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01d335a7-09e5-4073-bf70-5ac03807ff12-etc-machine-id\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.147956 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d335a7-09e5-4073-bf70-5ac03807ff12-logs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148080 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-scripts\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148123 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-scripts\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148182 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148245 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148273 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148368 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01d335a7-09e5-4073-bf70-5ac03807ff12-logs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148777 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-config-data-custom\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc 
kubenswrapper[4799]: I0319 20:24:23.148852 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5s9b\" (UniqueName: \"kubernetes.io/projected/6d3a79a1-4358-46e0-b444-151927715b1a-kube-api-access-f5s9b\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148956 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-public-tls-certs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.148999 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-config-data\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.149060 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-config-data\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.149109 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-run-httpd\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.149176 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-log-httpd\") 
pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.153249 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.153421 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.153789 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-log-httpd\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.154229 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.154352 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.154598 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-run-httpd\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.154805 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.155538 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc 
kubenswrapper[4799]: I0319 20:24:23.162853 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-scripts\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.164421 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.164725 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-config-data\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.165213 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-config-data-custom\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.165448 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e306dff-f3ce-48df-b9d4-aef952f3e0a5" path="/var/lib/kubelet/pods/6e306dff-f3ce-48df-b9d4-aef952f3e0a5/volumes" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.169657 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc 
kubenswrapper[4799]: I0319 20:24:23.170465 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjbq\" (UniqueName: \"kubernetes.io/projected/01d335a7-09e5-4073-bf70-5ac03807ff12-kube-api-access-psjbq\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.173539 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88950a1d-04b9-47f5-b45e-a30afb906eaa" path="/var/lib/kubelet/pods/88950a1d-04b9-47f5-b45e-a30afb906eaa/volumes" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.175097 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-config-data\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.177687 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.185411 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-public-tls-certs\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.195325 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d335a7-09e5-4073-bf70-5ac03807ff12-scripts\") pod \"cinder-api-0\" (UID: \"01d335a7-09e5-4073-bf70-5ac03807ff12\") " pod="openstack/cinder-api-0" Mar 19 20:24:23 crc 
kubenswrapper[4799]: I0319 20:24:23.201104 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5s9b\" (UniqueName: \"kubernetes.io/projected/6d3a79a1-4358-46e0-b444-151927715b1a-kube-api-access-f5s9b\") pod \"ceilometer-0\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.370526 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.408374 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:23 crc kubenswrapper[4799]: I0319 20:24:23.929819 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.046837 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.492404 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-868b778b64-pzgfd"] Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.494824 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.497081 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.505685 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.519971 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-868b778b64-pzgfd"] Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.580802 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-internal-tls-certs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.580853 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-config-data\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.580893 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-combined-ca-bundle\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.580917 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-logs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.580936 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-config-data-custom\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.580975 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-public-tls-certs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.581007 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4n96\" (UniqueName: \"kubernetes.io/projected/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-kube-api-access-s4n96\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683049 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-internal-tls-certs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683098 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-config-data\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683137 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-combined-ca-bundle\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683154 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-logs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683170 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-config-data-custom\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683206 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-public-tls-certs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.683235 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4n96\" (UniqueName: 
\"kubernetes.io/projected/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-kube-api-access-s4n96\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.684119 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-logs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.687276 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-combined-ca-bundle\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.690832 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-config-data-custom\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.695614 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-public-tls-certs\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.696065 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-internal-tls-certs\") 
pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.698123 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-config-data\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.698642 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4n96\" (UniqueName: \"kubernetes.io/projected/c6e6e053-3361-4d22-9ef7-fd7e96b77cf4-kube-api-access-s4n96\") pod \"barbican-api-868b778b64-pzgfd\" (UID: \"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4\") " pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.806428 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerStarted","Data":"8ca9c629404a820204009b08e62eb4e9eb40372475e797d31cb6065bfc3714d6"} Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.806525 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerStarted","Data":"eefe6dd1623541a090d1079ccf82bf7ed8cdc9b7f3f4855ff3e6ef65530bc16f"} Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.809778 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"01d335a7-09e5-4073-bf70-5ac03807ff12","Type":"ContainerStarted","Data":"f660d9a990fb59c209162d9238630e8f21e83aec87cc58c9453ec2b8fe0025ff"} Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.809800 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"01d335a7-09e5-4073-bf70-5ac03807ff12","Type":"ContainerStarted","Data":"b7f00edda7139cea68b7d536c5a9c11f57fa681235fabab6be14a6f8075b65c8"} Mar 19 20:24:24 crc kubenswrapper[4799]: I0319 20:24:24.812322 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.301694 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.343038 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-868b778b64-pzgfd"] Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.580471 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79858f4d8f-lj6r6"] Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.581064 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79858f4d8f-lj6r6" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-api" containerID="cri-o://3477927a0ef32b69a01e92568fc7afdb648c24c81e3a7b27ab7bdf39cd3e8b8d" gracePeriod=30 Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.581631 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79858f4d8f-lj6r6" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-httpd" containerID="cri-o://88c9011868828ff91e6dc46d7ff538d531a509ae0fe0fd8b03f81917a3ba7333" gracePeriod=30 Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.596816 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79858f4d8f-lj6r6" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": EOF" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.612443 4799 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-77768f5c85-6lgxw"] Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.618783 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.669683 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77768f5c85-6lgxw"] Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681085 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-public-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681121 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8m2n\" (UniqueName: \"kubernetes.io/projected/f2d71d6b-73c6-4edf-8ef2-5295c628603c-kube-api-access-t8m2n\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681150 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-httpd-config\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681172 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-combined-ca-bundle\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" 
Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681194 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-ovndb-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681251 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-config\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.681303 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-internal-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.785018 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-internal-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.785124 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-public-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 
20:24:25.785146 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8m2n\" (UniqueName: \"kubernetes.io/projected/f2d71d6b-73c6-4edf-8ef2-5295c628603c-kube-api-access-t8m2n\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.785172 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-httpd-config\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.785194 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-combined-ca-bundle\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.785217 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-ovndb-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.785270 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-config\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.805718 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-config\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.808241 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-public-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.809499 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-internal-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.810121 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-combined-ca-bundle\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.813584 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-httpd-config\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.816968 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d71d6b-73c6-4edf-8ef2-5295c628603c-ovndb-tls-certs\") pod \"neutron-77768f5c85-6lgxw\" (UID: 
\"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.862395 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"01d335a7-09e5-4073-bf70-5ac03807ff12","Type":"ContainerStarted","Data":"5f9d33a5fe93fbfcf1924f83b2dbee4900d11be11df8eb087b34e4e1eda85f68"} Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.862532 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8m2n\" (UniqueName: \"kubernetes.io/projected/f2d71d6b-73c6-4edf-8ef2-5295c628603c-kube-api-access-t8m2n\") pod \"neutron-77768f5c85-6lgxw\" (UID: \"f2d71d6b-73c6-4edf-8ef2-5295c628603c\") " pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.862596 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.875643 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerStarted","Data":"6464cad4f621fcf55e5ebb42382111dea71d0e723843e30024d6bdd184f20043"} Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.889376 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-868b778b64-pzgfd" event={"ID":"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4","Type":"ContainerStarted","Data":"7f524719d1b651126825d9f1f533cffdd37f640dc858bcee693e022bc2837597"} Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.889436 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-868b778b64-pzgfd" event={"ID":"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4","Type":"ContainerStarted","Data":"920554a8cc5e98e86aeac5c3929e7e25d5c86e48f914df9659a14d3364c43250"} Mar 19 20:24:25 crc kubenswrapper[4799]: I0319 20:24:25.917133 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=3.917116172 podStartE2EDuration="3.917116172s" podCreationTimestamp="2026-03-19 20:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:25.896827827 +0000 UTC m=+1143.502780889" watchObservedRunningTime="2026-03-19 20:24:25.917116172 +0000 UTC m=+1143.523069244" Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.021921 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.563649 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7cbfdb8596-sd77z" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.637166 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77768f5c85-6lgxw"] Mar 19 20:24:26 crc kubenswrapper[4799]: W0319 20:24:26.659071 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d71d6b_73c6_4edf_8ef2_5295c628603c.slice/crio-64177f4d6bb3a37d639b7e85038c22dfbd2d0e2137b0c6691c4a2cc89d5c50bb WatchSource:0}: Error finding container 64177f4d6bb3a37d639b7e85038c22dfbd2d0e2137b0c6691c4a2cc89d5c50bb: Status 404 returned error can't find the container with id 64177f4d6bb3a37d639b7e85038c22dfbd2d0e2137b0c6691c4a2cc89d5c50bb Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.899898 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-868b778b64-pzgfd" event={"ID":"c6e6e053-3361-4d22-9ef7-fd7e96b77cf4","Type":"ContainerStarted","Data":"df46d28cb458b633856457238c9e25b9a508d8f372e68174f04a2a0f4bcaf73c"} Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.901187 4799 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.901219 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.903025 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77768f5c85-6lgxw" event={"ID":"f2d71d6b-73c6-4edf-8ef2-5295c628603c","Type":"ContainerStarted","Data":"bd6d0bf3f345872b79ad746ffd405fda10cfdddf1515d5ec23034d58fc8f8430"} Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.903069 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77768f5c85-6lgxw" event={"ID":"f2d71d6b-73c6-4edf-8ef2-5295c628603c","Type":"ContainerStarted","Data":"64177f4d6bb3a37d639b7e85038c22dfbd2d0e2137b0c6691c4a2cc89d5c50bb"} Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.915689 4799 generic.go:334] "Generic (PLEG): container finished" podID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerID="88c9011868828ff91e6dc46d7ff538d531a509ae0fe0fd8b03f81917a3ba7333" exitCode=0 Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.915749 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79858f4d8f-lj6r6" event={"ID":"bec976b7-ee8e-46ca-bc02-c47336ee303b","Type":"ContainerDied","Data":"88c9011868828ff91e6dc46d7ff538d531a509ae0fe0fd8b03f81917a3ba7333"} Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.932465 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-868b778b64-pzgfd" podStartSLOduration=2.93245041 podStartE2EDuration="2.93245041s" podCreationTimestamp="2026-03-19 20:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:26.930559598 +0000 UTC m=+1144.536512670" watchObservedRunningTime="2026-03-19 
20:24:26.93245041 +0000 UTC m=+1144.538403482" Mar 19 20:24:26 crc kubenswrapper[4799]: I0319 20:24:26.944419 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerStarted","Data":"f2249b8e7bc21392f4ac1d92d84f2c7ed7a7f86f5310883b8d63f4bef5cfb036"} Mar 19 20:24:27 crc kubenswrapper[4799]: I0319 20:24:27.328976 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:24:27 crc kubenswrapper[4799]: I0319 20:24:27.480081 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79858f4d8f-lj6r6" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 19 20:24:27 crc kubenswrapper[4799]: I0319 20:24:27.604998 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:24:27 crc kubenswrapper[4799]: I0319 20:24:27.953791 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77768f5c85-6lgxw" event={"ID":"f2d71d6b-73c6-4edf-8ef2-5295c628603c","Type":"ContainerStarted","Data":"72236f356126d6d9ede1b883a96132e3bf5f1d7c66b7a17e4a7d80875c929ec8"} Mar 19 20:24:27 crc kubenswrapper[4799]: I0319 20:24:27.993189 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77768f5c85-6lgxw" podStartSLOduration=2.993163917 podStartE2EDuration="2.993163917s" podCreationTimestamp="2026-03-19 20:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:27.980768578 +0000 UTC m=+1145.586721650" watchObservedRunningTime="2026-03-19 20:24:27.993163917 +0000 UTC m=+1145.599117029" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 
20:24:28.241064 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.315029 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.532615 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.629219 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-z9nhz"] Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.638924 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" podUID="003320d3-b74f-4b94-9636-6b468817f9f1" containerName="dnsmasq-dns" containerID="cri-o://4833337c696fc7e189b49a5a66bd734547c7528a173f2d684778e1bdab3ab4e8" gracePeriod=10 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.757806 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.757863 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.757909 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 
20:24:28.758663 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebf1a7c33ca2e5a253f33af42655dacb79a5142f188b73cc17c9ab070ccc29c9"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.758750 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://ebf1a7c33ca2e5a253f33af42655dacb79a5142f188b73cc17c9ab070ccc29c9" gracePeriod=600 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.981346 4799 generic.go:334] "Generic (PLEG): container finished" podID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerID="3477927a0ef32b69a01e92568fc7afdb648c24c81e3a7b27ab7bdf39cd3e8b8d" exitCode=0 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.981663 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79858f4d8f-lj6r6" event={"ID":"bec976b7-ee8e-46ca-bc02-c47336ee303b","Type":"ContainerDied","Data":"3477927a0ef32b69a01e92568fc7afdb648c24c81e3a7b27ab7bdf39cd3e8b8d"} Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.983935 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="ebf1a7c33ca2e5a253f33af42655dacb79a5142f188b73cc17c9ab070ccc29c9" exitCode=0 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.984062 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"ebf1a7c33ca2e5a253f33af42655dacb79a5142f188b73cc17c9ab070ccc29c9"} Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.984092 
4799 scope.go:117] "RemoveContainer" containerID="13b6fab9ed6c0d9132855fd64438b4c86e33e17491f418536e545f20790a7c7a" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.987653 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerStarted","Data":"5103f9e42692b1721fda4a4f6ef3873e8c3ea190cb0c87779f34fb7f0513b524"} Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.988105 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.990851 4799 generic.go:334] "Generic (PLEG): container finished" podID="003320d3-b74f-4b94-9636-6b468817f9f1" containerID="4833337c696fc7e189b49a5a66bd734547c7528a173f2d684778e1bdab3ab4e8" exitCode=0 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.991053 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" event={"ID":"003320d3-b74f-4b94-9636-6b468817f9f1","Type":"ContainerDied","Data":"4833337c696fc7e189b49a5a66bd734547c7528a173f2d684778e1bdab3ab4e8"} Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.997114 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="cinder-scheduler" containerID="cri-o://ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189" gracePeriod=30 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.997273 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="probe" containerID="cri-o://faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2" gracePeriod=30 Mar 19 20:24:28 crc kubenswrapper[4799]: I0319 20:24:28.998044 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.009177 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.092161644 podStartE2EDuration="7.009160713s" podCreationTimestamp="2026-03-19 20:24:22 +0000 UTC" firstStartedPulling="2026-03-19 20:24:24.060091699 +0000 UTC m=+1141.666044771" lastFinishedPulling="2026-03-19 20:24:27.977090768 +0000 UTC m=+1145.583043840" observedRunningTime="2026-03-19 20:24:29.002216953 +0000 UTC m=+1146.608170025" watchObservedRunningTime="2026-03-19 20:24:29.009160713 +0000 UTC m=+1146.615113785" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.315230 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.472056 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-sb\") pod \"003320d3-b74f-4b94-9636-6b468817f9f1\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.472113 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-config\") pod \"003320d3-b74f-4b94-9636-6b468817f9f1\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.472141 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-svc\") pod \"003320d3-b74f-4b94-9636-6b468817f9f1\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.472177 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-swift-storage-0\") pod \"003320d3-b74f-4b94-9636-6b468817f9f1\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.472231 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2l2c\" (UniqueName: \"kubernetes.io/projected/003320d3-b74f-4b94-9636-6b468817f9f1-kube-api-access-d2l2c\") pod \"003320d3-b74f-4b94-9636-6b468817f9f1\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.472277 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-nb\") pod \"003320d3-b74f-4b94-9636-6b468817f9f1\" (UID: \"003320d3-b74f-4b94-9636-6b468817f9f1\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.498289 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003320d3-b74f-4b94-9636-6b468817f9f1-kube-api-access-d2l2c" (OuterVolumeSpecName: "kube-api-access-d2l2c") pod "003320d3-b74f-4b94-9636-6b468817f9f1" (UID: "003320d3-b74f-4b94-9636-6b468817f9f1"). InnerVolumeSpecName "kube-api-access-d2l2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.569698 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.571125 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "003320d3-b74f-4b94-9636-6b468817f9f1" (UID: "003320d3-b74f-4b94-9636-6b468817f9f1"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.580359 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.580396 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2l2c\" (UniqueName: \"kubernetes.io/projected/003320d3-b74f-4b94-9636-6b468817f9f1-kube-api-access-d2l2c\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.602948 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "003320d3-b74f-4b94-9636-6b468817f9f1" (UID: "003320d3-b74f-4b94-9636-6b468817f9f1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.603450 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "003320d3-b74f-4b94-9636-6b468817f9f1" (UID: "003320d3-b74f-4b94-9636-6b468817f9f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.604422 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-config" (OuterVolumeSpecName: "config") pod "003320d3-b74f-4b94-9636-6b468817f9f1" (UID: "003320d3-b74f-4b94-9636-6b468817f9f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.630931 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "003320d3-b74f-4b94-9636-6b468817f9f1" (UID: "003320d3-b74f-4b94-9636-6b468817f9f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.683445 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbzkt\" (UniqueName: \"kubernetes.io/projected/bec976b7-ee8e-46ca-bc02-c47336ee303b-kube-api-access-wbzkt\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.683788 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-httpd-config\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.683807 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-public-tls-certs\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.683886 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-combined-ca-bundle\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.683932 4799 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-config\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.683973 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-internal-tls-certs\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.684017 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-ovndb-tls-certs\") pod \"bec976b7-ee8e-46ca-bc02-c47336ee303b\" (UID: \"bec976b7-ee8e-46ca-bc02-c47336ee303b\") " Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.684414 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.684427 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.684438 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.684446 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003320d3-b74f-4b94-9636-6b468817f9f1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 
19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.686706 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.690044 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec976b7-ee8e-46ca-bc02-c47336ee303b-kube-api-access-wbzkt" (OuterVolumeSpecName: "kube-api-access-wbzkt") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "kube-api-access-wbzkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.690447 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.785699 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.785725 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbzkt\" (UniqueName: \"kubernetes.io/projected/bec976b7-ee8e-46ca-bc02-c47336ee303b-kube-api-access-wbzkt\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.790047 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-config" (OuterVolumeSpecName: "config") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.797538 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.819461 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.821022 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.822639 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bec976b7-ee8e-46ca-bc02-c47336ee303b" (UID: "bec976b7-ee8e-46ca-bc02-c47336ee303b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.887885 4799 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.887916 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.887927 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.887935 4799 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.887947 4799 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec976b7-ee8e-46ca-bc02-c47336ee303b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.932081 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-56454c8868-kxl79" Mar 19 20:24:29 crc kubenswrapper[4799]: I0319 20:24:29.996649 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8869c89f8-jvpbt"] Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.017484 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79858f4d8f-lj6r6" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.017532 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79858f4d8f-lj6r6" event={"ID":"bec976b7-ee8e-46ca-bc02-c47336ee303b","Type":"ContainerDied","Data":"6eb648c29eae0ea5c2c54ef0f9e9136a0e6c5a96f2c4fb226c46fba12746d11e"} Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.017593 4799 scope.go:117] "RemoveContainer" containerID="88c9011868828ff91e6dc46d7ff538d531a509ae0fe0fd8b03f81917a3ba7333" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.023583 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"896459b668fd89c49f3549689a461713efd29689b0eb95fb693ea8252573a827"} Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.034576 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" event={"ID":"003320d3-b74f-4b94-9636-6b468817f9f1","Type":"ContainerDied","Data":"7965a5a272114b45f75262f4457b1980c97209ee3c4af91f2c2e62cbff2ccb95"} Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.034771 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-z9nhz" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.046589 4799 generic.go:334] "Generic (PLEG): container finished" podID="19271643-f120-41c0-aec0-69316be7c3c2" containerID="faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2" exitCode=0 Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.046863 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8869c89f8-jvpbt" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon-log" containerID="cri-o://ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4" gracePeriod=30 Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.047092 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19271643-f120-41c0-aec0-69316be7c3c2","Type":"ContainerDied","Data":"faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2"} Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.047250 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8869c89f8-jvpbt" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" containerID="cri-o://132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612" gracePeriod=30 Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.102060 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-z9nhz"] Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.114052 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-z9nhz"] Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.114775 4799 scope.go:117] "RemoveContainer" containerID="3477927a0ef32b69a01e92568fc7afdb648c24c81e3a7b27ab7bdf39cd3e8b8d" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.160280 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79858f4d8f-lj6r6"] 
Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.164293 4799 scope.go:117] "RemoveContainer" containerID="4833337c696fc7e189b49a5a66bd734547c7528a173f2d684778e1bdab3ab4e8" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.168511 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79858f4d8f-lj6r6"] Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.183178 4799 scope.go:117] "RemoveContainer" containerID="341636d22e92d1204032f55e431e97c243b62c7cdb38bde6d518ec898a4c1ced" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.281711 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:30 crc kubenswrapper[4799]: I0319 20:24:30.683761 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.139030 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003320d3-b74f-4b94-9636-6b468817f9f1" path="/var/lib/kubelet/pods/003320d3-b74f-4b94-9636-6b468817f9f1/volumes" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.140725 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" path="/var/lib/kubelet/pods/bec976b7-ee8e-46ca-bc02-c47336ee303b/volumes" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.737353 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.786472 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4tgb\" (UniqueName: \"kubernetes.io/projected/19271643-f120-41c0-aec0-69316be7c3c2-kube-api-access-t4tgb\") pod \"19271643-f120-41c0-aec0-69316be7c3c2\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.786531 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-scripts\") pod \"19271643-f120-41c0-aec0-69316be7c3c2\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.786720 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-combined-ca-bundle\") pod \"19271643-f120-41c0-aec0-69316be7c3c2\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.786742 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19271643-f120-41c0-aec0-69316be7c3c2-etc-machine-id\") pod \"19271643-f120-41c0-aec0-69316be7c3c2\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.786794 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data-custom\") pod \"19271643-f120-41c0-aec0-69316be7c3c2\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.786819 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data\") pod \"19271643-f120-41c0-aec0-69316be7c3c2\" (UID: \"19271643-f120-41c0-aec0-69316be7c3c2\") " Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.787481 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19271643-f120-41c0-aec0-69316be7c3c2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "19271643-f120-41c0-aec0-69316be7c3c2" (UID: "19271643-f120-41c0-aec0-69316be7c3c2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.797273 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-scripts" (OuterVolumeSpecName: "scripts") pod "19271643-f120-41c0-aec0-69316be7c3c2" (UID: "19271643-f120-41c0-aec0-69316be7c3c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.807001 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "19271643-f120-41c0-aec0-69316be7c3c2" (UID: "19271643-f120-41c0-aec0-69316be7c3c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.808270 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19271643-f120-41c0-aec0-69316be7c3c2-kube-api-access-t4tgb" (OuterVolumeSpecName: "kube-api-access-t4tgb") pod "19271643-f120-41c0-aec0-69316be7c3c2" (UID: "19271643-f120-41c0-aec0-69316be7c3c2"). InnerVolumeSpecName "kube-api-access-t4tgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.854406 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19271643-f120-41c0-aec0-69316be7c3c2" (UID: "19271643-f120-41c0-aec0-69316be7c3c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.888884 4799 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.888917 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4tgb\" (UniqueName: \"kubernetes.io/projected/19271643-f120-41c0-aec0-69316be7c3c2-kube-api-access-t4tgb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.888932 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.888943 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.888953 4799 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19271643-f120-41c0-aec0-69316be7c3c2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.891532 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data" (OuterVolumeSpecName: "config-data") pod "19271643-f120-41c0-aec0-69316be7c3c2" (UID: "19271643-f120-41c0-aec0-69316be7c3c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:31 crc kubenswrapper[4799]: I0319 20:24:31.990731 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19271643-f120-41c0-aec0-69316be7c3c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.066631 4799 generic.go:334] "Generic (PLEG): container finished" podID="19271643-f120-41c0-aec0-69316be7c3c2" containerID="ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189" exitCode=0 Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.066673 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19271643-f120-41c0-aec0-69316be7c3c2","Type":"ContainerDied","Data":"ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189"} Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.066729 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"19271643-f120-41c0-aec0-69316be7c3c2","Type":"ContainerDied","Data":"bdcffdc18367be84fd6e845b099606d5bae26e497ce9643471c767807f155c58"} Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.066749 4799 scope.go:117] "RemoveContainer" containerID="faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.066691 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.108198 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.118338 4799 scope.go:117] "RemoveContainer" containerID="ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.128235 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.162367 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.163026 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003320d3-b74f-4b94-9636-6b468817f9f1" containerName="init" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163043 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="003320d3-b74f-4b94-9636-6b468817f9f1" containerName="init" Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.163061 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="cinder-scheduler" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163068 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="cinder-scheduler" Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.163113 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-httpd" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163121 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-httpd" Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.163138 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="003320d3-b74f-4b94-9636-6b468817f9f1" containerName="dnsmasq-dns" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163144 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="003320d3-b74f-4b94-9636-6b468817f9f1" containerName="dnsmasq-dns" Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.163163 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-api" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163168 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-api" Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.163179 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="probe" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163184 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="probe" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163337 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-api" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163352 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="cinder-scheduler" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163364 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="19271643-f120-41c0-aec0-69316be7c3c2" containerName="probe" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163373 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec976b7-ee8e-46ca-bc02-c47336ee303b" containerName="neutron-httpd" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.163398 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="003320d3-b74f-4b94-9636-6b468817f9f1" 
containerName="dnsmasq-dns" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.164344 4799 scope.go:117] "RemoveContainer" containerID="faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.164410 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.168573 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.175855 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 20:24:32.176005 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2\": container with ID starting with faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2 not found: ID does not exist" containerID="faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.176045 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2"} err="failed to get container status \"faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2\": rpc error: code = NotFound desc = could not find container \"faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2\": container with ID starting with faf6bbeaeb02c787136d6b80f19cca5b109d3488c34730360fd558c257a7fdf2 not found: ID does not exist" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.176087 4799 scope.go:117] "RemoveContainer" containerID="ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189" Mar 19 20:24:32 crc kubenswrapper[4799]: E0319 
20:24:32.192179 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189\": container with ID starting with ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189 not found: ID does not exist" containerID="ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.192219 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189"} err="failed to get container status \"ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189\": rpc error: code = NotFound desc = could not find container \"ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189\": container with ID starting with ccd8ec387f5920f977f898c44f82e7f69de5c149c1b5ef63335f144529b0e189 not found: ID does not exist" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.199050 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fnkg\" (UniqueName: \"kubernetes.io/projected/b6c780df-41c8-47d8-af6d-3dcefb770b8d-kube-api-access-4fnkg\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.199088 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.199145 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.199184 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.199219 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c780df-41c8-47d8-af6d-3dcefb770b8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.199239 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.300685 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fnkg\" (UniqueName: \"kubernetes.io/projected/b6c780df-41c8-47d8-af6d-3dcefb770b8d-kube-api-access-4fnkg\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.300735 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.300810 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.300856 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.300910 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c780df-41c8-47d8-af6d-3dcefb770b8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.300938 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.305465 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6c780df-41c8-47d8-af6d-3dcefb770b8d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.308123 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.309012 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.310079 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.310630 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c780df-41c8-47d8-af6d-3dcefb770b8d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.322651 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fnkg\" (UniqueName: \"kubernetes.io/projected/b6c780df-41c8-47d8-af6d-3dcefb770b8d-kube-api-access-4fnkg\") pod \"cinder-scheduler-0\" (UID: \"b6c780df-41c8-47d8-af6d-3dcefb770b8d\") " pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.507715 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 20:24:32 crc kubenswrapper[4799]: I0319 20:24:32.938929 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 20:24:32 crc kubenswrapper[4799]: W0319 20:24:32.944257 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6c780df_41c8_47d8_af6d_3dcefb770b8d.slice/crio-5fbbe42f23762a89e0a381be84c38c9926c19a0087b6560eeb64766950b37313 WatchSource:0}: Error finding container 5fbbe42f23762a89e0a381be84c38c9926c19a0087b6560eeb64766950b37313: Status 404 returned error can't find the container with id 5fbbe42f23762a89e0a381be84c38c9926c19a0087b6560eeb64766950b37313 Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.054424 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.056287 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.079888 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c780df-41c8-47d8-af6d-3dcefb770b8d","Type":"ContainerStarted","Data":"5fbbe42f23762a89e0a381be84c38c9926c19a0087b6560eeb64766950b37313"} Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.151495 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19271643-f120-41c0-aec0-69316be7c3c2" path="/var/lib/kubelet/pods/19271643-f120-41c0-aec0-69316be7c3c2/volumes" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.313445 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-686489978d-5lwnf"] Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.314909 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.328323 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-686489978d-5lwnf"] Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.441589 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-combined-ca-bundle\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.441939 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-public-tls-certs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.442010 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-internal-tls-certs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.442064 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-config-data\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.442128 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-logs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.442145 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-scripts\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.442166 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6jd\" (UniqueName: \"kubernetes.io/projected/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-kube-api-access-cg6jd\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544112 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-config-data\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544194 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-logs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544213 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-scripts\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544242 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6jd\" (UniqueName: \"kubernetes.io/projected/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-kube-api-access-cg6jd\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544281 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-combined-ca-bundle\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544303 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-public-tls-certs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.544376 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-internal-tls-certs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.545169 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-logs\") pod 
\"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.549714 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-scripts\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.552956 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-public-tls-certs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.563101 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-config-data\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.563909 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-internal-tls-certs\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.568753 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-combined-ca-bundle\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 
19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.569022 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6jd\" (UniqueName: \"kubernetes.io/projected/55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3-kube-api-access-cg6jd\") pod \"placement-686489978d-5lwnf\" (UID: \"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3\") " pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:33 crc kubenswrapper[4799]: I0319 20:24:33.698789 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:34 crc kubenswrapper[4799]: I0319 20:24:34.112499 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c780df-41c8-47d8-af6d-3dcefb770b8d","Type":"ContainerStarted","Data":"f2faeec148a08f2b7c3ab19aa3ae37d2818d716d7ec0b65e081b3772a4b3ac02"} Mar 19 20:24:34 crc kubenswrapper[4799]: I0319 20:24:34.129791 4799 generic.go:334] "Generic (PLEG): container finished" podID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerID="132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612" exitCode=0 Mar 19 20:24:34 crc kubenswrapper[4799]: I0319 20:24:34.129833 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8869c89f8-jvpbt" event={"ID":"ecf2a634-2499-4ea6-853f-9c8852d65e01","Type":"ContainerDied","Data":"132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612"} Mar 19 20:24:34 crc kubenswrapper[4799]: I0319 20:24:34.276245 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-686489978d-5lwnf"] Mar 19 20:24:34 crc kubenswrapper[4799]: W0319 20:24:34.292395 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55bf4b6f_6b3e_496d_8e5b_2be9d9fbcbc3.slice/crio-a5e3cb188306c54b8f999f230f8819422803fb2f073c518c08e544fd778d4b0b WatchSource:0}: Error finding container 
a5e3cb188306c54b8f999f230f8819422803fb2f073c518c08e544fd778d4b0b: Status 404 returned error can't find the container with id a5e3cb188306c54b8f999f230f8819422803fb2f073c518c08e544fd778d4b0b Mar 19 20:24:34 crc kubenswrapper[4799]: I0319 20:24:34.682278 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76f77cb758-wwjbp" Mar 19 20:24:34 crc kubenswrapper[4799]: I0319 20:24:34.984451 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8869c89f8-jvpbt" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.138716 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-686489978d-5lwnf" event={"ID":"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3","Type":"ContainerStarted","Data":"191a8d22ace77faa223776a4084a1e523d4b8a40378310621e76468787d0b6c1"} Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.138760 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-686489978d-5lwnf" event={"ID":"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3","Type":"ContainerStarted","Data":"d547b7f2570e8ad6d4a3de02d803c3ca722efc8b42f358760013df10201b4c26"} Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.138779 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-686489978d-5lwnf" event={"ID":"55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3","Type":"ContainerStarted","Data":"a5e3cb188306c54b8f999f230f8819422803fb2f073c518c08e544fd778d4b0b"} Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.138829 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.138866 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/placement-686489978d-5lwnf" Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.140939 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6c780df-41c8-47d8-af6d-3dcefb770b8d","Type":"ContainerStarted","Data":"8a5eb0ead878dff8a2bdd162db3e534721faa5dd38837411b96296a9ccfaf1d5"} Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.171094 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-686489978d-5lwnf" podStartSLOduration=2.171076897 podStartE2EDuration="2.171076897s" podCreationTimestamp="2026-03-19 20:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:35.164490757 +0000 UTC m=+1152.770443829" watchObservedRunningTime="2026-03-19 20:24:35.171076897 +0000 UTC m=+1152.777029969" Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.209326 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.209307203 podStartE2EDuration="3.209307203s" podCreationTimestamp="2026-03-19 20:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:35.202772514 +0000 UTC m=+1152.808725586" watchObservedRunningTime="2026-03-19 20:24:35.209307203 +0000 UTC m=+1152.815260275" Mar 19 20:24:35 crc kubenswrapper[4799]: I0319 20:24:35.842118 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 20:24:36 crc kubenswrapper[4799]: I0319 20:24:36.792322 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:36 crc kubenswrapper[4799]: I0319 20:24:36.959217 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-868b778b64-pzgfd" Mar 19 20:24:37 crc kubenswrapper[4799]: I0319 20:24:37.039693 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7cbfdb8596-sd77z"] Mar 19 20:24:37 crc kubenswrapper[4799]: I0319 20:24:37.039940 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7cbfdb8596-sd77z" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api-log" containerID="cri-o://c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390" gracePeriod=30 Mar 19 20:24:37 crc kubenswrapper[4799]: I0319 20:24:37.039996 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7cbfdb8596-sd77z" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api" containerID="cri-o://331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca" gracePeriod=30 Mar 19 20:24:37 crc kubenswrapper[4799]: I0319 20:24:37.508695 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.210359 4799 generic.go:334] "Generic (PLEG): container finished" podID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerID="c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390" exitCode=143 Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.210443 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbfdb8596-sd77z" event={"ID":"2fc302d7-a52f-43bd-9d88-06fa20300a1e","Type":"ContainerDied","Data":"c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390"} Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.361817 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.362890 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.365630 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-49grs" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.365841 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.366081 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.375959 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.478330 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkfq\" (UniqueName: \"kubernetes.io/projected/575c8839-3cdb-4137-967a-3544c626113f-kube-api-access-ppkfq\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.478394 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575c8839-3cdb-4137-967a-3544c626113f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.478552 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/575c8839-3cdb-4137-967a-3544c626113f-openstack-config-secret\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.478717 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/575c8839-3cdb-4137-967a-3544c626113f-openstack-config\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.581841 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/575c8839-3cdb-4137-967a-3544c626113f-openstack-config-secret\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.582119 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/575c8839-3cdb-4137-967a-3544c626113f-openstack-config\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.582280 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkfq\" (UniqueName: \"kubernetes.io/projected/575c8839-3cdb-4137-967a-3544c626113f-kube-api-access-ppkfq\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.582311 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575c8839-3cdb-4137-967a-3544c626113f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.583796 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/575c8839-3cdb-4137-967a-3544c626113f-openstack-config\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.592891 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/575c8839-3cdb-4137-967a-3544c626113f-openstack-config-secret\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.597686 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575c8839-3cdb-4137-967a-3544c626113f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.603654 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkfq\" (UniqueName: \"kubernetes.io/projected/575c8839-3cdb-4137-967a-3544c626113f-kube-api-access-ppkfq\") pod \"openstackclient\" (UID: \"575c8839-3cdb-4137-967a-3544c626113f\") " pod="openstack/openstackclient" Mar 19 20:24:38 crc kubenswrapper[4799]: I0319 20:24:38.681640 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 20:24:39 crc kubenswrapper[4799]: W0319 20:24:39.230843 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575c8839_3cdb_4137_967a_3544c626113f.slice/crio-42cca2f5f3f98800100c2ff2a8ea5af3008fa940177709050f3df3953ed078c9 WatchSource:0}: Error finding container 42cca2f5f3f98800100c2ff2a8ea5af3008fa940177709050f3df3953ed078c9: Status 404 returned error can't find the container with id 42cca2f5f3f98800100c2ff2a8ea5af3008fa940177709050f3df3953ed078c9 Mar 19 20:24:39 crc kubenswrapper[4799]: I0319 20:24:39.233230 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:24:39 crc kubenswrapper[4799]: I0319 20:24:39.234079 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 20:24:40 crc kubenswrapper[4799]: I0319 20:24:40.238669 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"575c8839-3cdb-4137-967a-3544c626113f","Type":"ContainerStarted","Data":"42cca2f5f3f98800100c2ff2a8ea5af3008fa940177709050f3df3953ed078c9"} Mar 19 20:24:40 crc kubenswrapper[4799]: I0319 20:24:40.708853 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cbfdb8596-sd77z" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:45232->10.217.0.168:9311: read: connection reset by peer" Mar 19 20:24:40 crc kubenswrapper[4799]: I0319 20:24:40.709145 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cbfdb8596-sd77z" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:45248->10.217.0.168:9311: read: connection reset 
by peer" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.113256 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.231752 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc302d7-a52f-43bd-9d88-06fa20300a1e-logs\") pod \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.231896 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxgzl\" (UniqueName: \"kubernetes.io/projected/2fc302d7-a52f-43bd-9d88-06fa20300a1e-kube-api-access-vxgzl\") pod \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.231955 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data-custom\") pod \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.231983 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-combined-ca-bundle\") pod \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.232082 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data\") pod \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\" (UID: \"2fc302d7-a52f-43bd-9d88-06fa20300a1e\") " Mar 19 20:24:41 crc 
kubenswrapper[4799]: I0319 20:24:41.233361 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc302d7-a52f-43bd-9d88-06fa20300a1e-logs" (OuterVolumeSpecName: "logs") pod "2fc302d7-a52f-43bd-9d88-06fa20300a1e" (UID: "2fc302d7-a52f-43bd-9d88-06fa20300a1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.249653 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2fc302d7-a52f-43bd-9d88-06fa20300a1e" (UID: "2fc302d7-a52f-43bd-9d88-06fa20300a1e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.250190 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc302d7-a52f-43bd-9d88-06fa20300a1e-kube-api-access-vxgzl" (OuterVolumeSpecName: "kube-api-access-vxgzl") pod "2fc302d7-a52f-43bd-9d88-06fa20300a1e" (UID: "2fc302d7-a52f-43bd-9d88-06fa20300a1e"). InnerVolumeSpecName "kube-api-access-vxgzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.260664 4799 generic.go:334] "Generic (PLEG): container finished" podID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerID="331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca" exitCode=0 Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.260746 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbfdb8596-sd77z" event={"ID":"2fc302d7-a52f-43bd-9d88-06fa20300a1e","Type":"ContainerDied","Data":"331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca"} Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.260827 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbfdb8596-sd77z" event={"ID":"2fc302d7-a52f-43bd-9d88-06fa20300a1e","Type":"ContainerDied","Data":"6db2279c43c56f35e2425be4145a34964ee074316edfb912040f9cb9ab50eaa0"} Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.260888 4799 scope.go:117] "RemoveContainer" containerID="331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.261280 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbfdb8596-sd77z" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.268428 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fc302d7-a52f-43bd-9d88-06fa20300a1e" (UID: "2fc302d7-a52f-43bd-9d88-06fa20300a1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.286622 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data" (OuterVolumeSpecName: "config-data") pod "2fc302d7-a52f-43bd-9d88-06fa20300a1e" (UID: "2fc302d7-a52f-43bd-9d88-06fa20300a1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.334753 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc302d7-a52f-43bd-9d88-06fa20300a1e-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.334793 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxgzl\" (UniqueName: \"kubernetes.io/projected/2fc302d7-a52f-43bd-9d88-06fa20300a1e-kube-api-access-vxgzl\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.334806 4799 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.334817 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.334829 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc302d7-a52f-43bd-9d88-06fa20300a1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.357166 4799 scope.go:117] "RemoveContainer" containerID="c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390" Mar 
19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.389017 4799 scope.go:117] "RemoveContainer" containerID="331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca" Mar 19 20:24:41 crc kubenswrapper[4799]: E0319 20:24:41.389465 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca\": container with ID starting with 331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca not found: ID does not exist" containerID="331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.389521 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca"} err="failed to get container status \"331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca\": rpc error: code = NotFound desc = could not find container \"331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca\": container with ID starting with 331b921f32738406d3eb1ddd3c928e103eb04450dd80b43e371090e52b9910ca not found: ID does not exist" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.389560 4799 scope.go:117] "RemoveContainer" containerID="c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390" Mar 19 20:24:41 crc kubenswrapper[4799]: E0319 20:24:41.390032 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390\": container with ID starting with c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390 not found: ID does not exist" containerID="c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.390064 4799 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390"} err="failed to get container status \"c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390\": rpc error: code = NotFound desc = could not find container \"c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390\": container with ID starting with c23af3fe9a3798239bdca7c45fb0cac80754159f365940b4be8bbb5300936390 not found: ID does not exist" Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.611782 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7cbfdb8596-sd77z"] Mar 19 20:24:41 crc kubenswrapper[4799]: I0319 20:24:41.621578 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7cbfdb8596-sd77z"] Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.568554 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75d564c56c-vzn6z"] Mar 19 20:24:42 crc kubenswrapper[4799]: E0319 20:24:42.569291 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api-log" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.569306 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api-log" Mar 19 20:24:42 crc kubenswrapper[4799]: E0319 20:24:42.569328 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.569334 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.569526 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 
20:24:42.569553 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" containerName="barbican-api-log" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.570470 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.572484 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.573846 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.575139 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.581722 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75d564c56c-vzn6z"] Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.660543 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-combined-ca-bundle\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.660587 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed03c750-cae7-4181-8451-c88d57969c01-run-httpd\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.660617 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-internal-tls-certs\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.660879 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-public-tls-certs\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.660999 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lnw\" (UniqueName: \"kubernetes.io/projected/ed03c750-cae7-4181-8451-c88d57969c01-kube-api-access-l8lnw\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.661136 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed03c750-cae7-4181-8451-c88d57969c01-etc-swift\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.661370 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-config-data\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.661423 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed03c750-cae7-4181-8451-c88d57969c01-log-httpd\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.762876 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-config-data\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.762941 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed03c750-cae7-4181-8451-c88d57969c01-log-httpd\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.762993 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-combined-ca-bundle\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.763011 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed03c750-cae7-4181-8451-c88d57969c01-run-httpd\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.763033 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-internal-tls-certs\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.763090 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-public-tls-certs\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.763134 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lnw\" (UniqueName: \"kubernetes.io/projected/ed03c750-cae7-4181-8451-c88d57969c01-kube-api-access-l8lnw\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.763172 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed03c750-cae7-4181-8451-c88d57969c01-etc-swift\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.764501 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed03c750-cae7-4181-8451-c88d57969c01-run-httpd\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.765556 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed03c750-cae7-4181-8451-c88d57969c01-log-httpd\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.770362 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-internal-tls-certs\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.770530 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ed03c750-cae7-4181-8451-c88d57969c01-etc-swift\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.771060 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-combined-ca-bundle\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.771219 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-config-data\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.778616 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed03c750-cae7-4181-8451-c88d57969c01-public-tls-certs\") pod 
\"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.790237 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lnw\" (UniqueName: \"kubernetes.io/projected/ed03c750-cae7-4181-8451-c88d57969c01-kube-api-access-l8lnw\") pod \"swift-proxy-75d564c56c-vzn6z\" (UID: \"ed03c750-cae7-4181-8451-c88d57969c01\") " pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.794704 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 20:24:42 crc kubenswrapper[4799]: I0319 20:24:42.885431 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:43 crc kubenswrapper[4799]: I0319 20:24:43.129494 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc302d7-a52f-43bd-9d88-06fa20300a1e" path="/var/lib/kubelet/pods/2fc302d7-a52f-43bd-9d88-06fa20300a1e/volumes" Mar 19 20:24:43 crc kubenswrapper[4799]: W0319 20:24:43.541830 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded03c750_cae7_4181_8451_c88d57969c01.slice/crio-9be3d8e6d53d89059cf8fbe29c0411ecb976b3a4619a6ea70e7166f9a9846200 WatchSource:0}: Error finding container 9be3d8e6d53d89059cf8fbe29c0411ecb976b3a4619a6ea70e7166f9a9846200: Status 404 returned error can't find the container with id 9be3d8e6d53d89059cf8fbe29c0411ecb976b3a4619a6ea70e7166f9a9846200 Mar 19 20:24:43 crc kubenswrapper[4799]: I0319 20:24:43.555780 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75d564c56c-vzn6z"] Mar 19 20:24:44 crc kubenswrapper[4799]: I0319 20:24:44.314328 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75d564c56c-vzn6z" 
event={"ID":"ed03c750-cae7-4181-8451-c88d57969c01","Type":"ContainerStarted","Data":"96cae8ca570c448cb084e0ffde8312e2bf6c7934102ad7c263d42e5aa4628752"} Mar 19 20:24:44 crc kubenswrapper[4799]: I0319 20:24:44.314631 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:44 crc kubenswrapper[4799]: I0319 20:24:44.314643 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75d564c56c-vzn6z" event={"ID":"ed03c750-cae7-4181-8451-c88d57969c01","Type":"ContainerStarted","Data":"85ec16d68c4b92249fec6b7676dd786d43df7efe4c9054af5c04be148a193832"} Mar 19 20:24:44 crc kubenswrapper[4799]: I0319 20:24:44.314651 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75d564c56c-vzn6z" event={"ID":"ed03c750-cae7-4181-8451-c88d57969c01","Type":"ContainerStarted","Data":"9be3d8e6d53d89059cf8fbe29c0411ecb976b3a4619a6ea70e7166f9a9846200"} Mar 19 20:24:44 crc kubenswrapper[4799]: I0319 20:24:44.984989 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8869c89f8-jvpbt" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.321621 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.578999 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75d564c56c-vzn6z" podStartSLOduration=3.578976313 podStartE2EDuration="3.578976313s" podCreationTimestamp="2026-03-19 20:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:44.33352197 +0000 UTC 
m=+1161.939475042" watchObservedRunningTime="2026-03-19 20:24:45.578976313 +0000 UTC m=+1163.184929405" Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.589155 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.589612 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-central-agent" containerID="cri-o://8ca9c629404a820204009b08e62eb4e9eb40372475e797d31cb6065bfc3714d6" gracePeriod=30 Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.590084 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="sg-core" containerID="cri-o://f2249b8e7bc21392f4ac1d92d84f2c7ed7a7f86f5310883b8d63f4bef5cfb036" gracePeriod=30 Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.590171 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="proxy-httpd" containerID="cri-o://5103f9e42692b1721fda4a4f6ef3873e8c3ea190cb0c87779f34fb7f0513b524" gracePeriod=30 Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.590091 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-notification-agent" containerID="cri-o://6464cad4f621fcf55e5ebb42382111dea71d0e723843e30024d6bdd184f20043" gracePeriod=30 Mar 19 20:24:45 crc kubenswrapper[4799]: I0319 20:24:45.627539 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.171:3000/\": read tcp 10.217.0.2:49076->10.217.0.171:3000: read: 
connection reset by peer" Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330368 4799 generic.go:334] "Generic (PLEG): container finished" podID="6d3a79a1-4358-46e0-b444-151927715b1a" containerID="5103f9e42692b1721fda4a4f6ef3873e8c3ea190cb0c87779f34fb7f0513b524" exitCode=0 Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330414 4799 generic.go:334] "Generic (PLEG): container finished" podID="6d3a79a1-4358-46e0-b444-151927715b1a" containerID="f2249b8e7bc21392f4ac1d92d84f2c7ed7a7f86f5310883b8d63f4bef5cfb036" exitCode=2 Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330422 4799 generic.go:334] "Generic (PLEG): container finished" podID="6d3a79a1-4358-46e0-b444-151927715b1a" containerID="6464cad4f621fcf55e5ebb42382111dea71d0e723843e30024d6bdd184f20043" exitCode=0 Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330429 4799 generic.go:334] "Generic (PLEG): container finished" podID="6d3a79a1-4358-46e0-b444-151927715b1a" containerID="8ca9c629404a820204009b08e62eb4e9eb40372475e797d31cb6065bfc3714d6" exitCode=0 Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330432 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerDied","Data":"5103f9e42692b1721fda4a4f6ef3873e8c3ea190cb0c87779f34fb7f0513b524"} Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330476 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerDied","Data":"f2249b8e7bc21392f4ac1d92d84f2c7ed7a7f86f5310883b8d63f4bef5cfb036"} Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 20:24:46.330487 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerDied","Data":"6464cad4f621fcf55e5ebb42382111dea71d0e723843e30024d6bdd184f20043"} Mar 19 20:24:46 crc kubenswrapper[4799]: I0319 
20:24:46.330496 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerDied","Data":"8ca9c629404a820204009b08e62eb4e9eb40372475e797d31cb6065bfc3714d6"} Mar 19 20:24:48 crc kubenswrapper[4799]: I0319 20:24:48.660951 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:48 crc kubenswrapper[4799]: I0319 20:24:48.661620 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-log" containerID="cri-o://268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f" gracePeriod=30 Mar 19 20:24:48 crc kubenswrapper[4799]: I0319 20:24:48.661766 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-httpd" containerID="cri-o://e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42" gracePeriod=30 Mar 19 20:24:49 crc kubenswrapper[4799]: I0319 20:24:49.355277 4799 generic.go:334] "Generic (PLEG): container finished" podID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerID="268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f" exitCode=143 Mar 19 20:24:49 crc kubenswrapper[4799]: I0319 20:24:49.355406 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9f48b3d-c638-4337-b3f7-c599bcf7ef72","Type":"ContainerDied","Data":"268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f"} Mar 19 20:24:49 crc kubenswrapper[4799]: I0319 20:24:49.948090 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115167 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-run-httpd\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115261 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-config-data\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115349 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-scripts\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115462 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-sg-core-conf-yaml\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115490 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-log-httpd\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115556 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5s9b\" (UniqueName: 
\"kubernetes.io/projected/6d3a79a1-4358-46e0-b444-151927715b1a-kube-api-access-f5s9b\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.115592 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-combined-ca-bundle\") pod \"6d3a79a1-4358-46e0-b444-151927715b1a\" (UID: \"6d3a79a1-4358-46e0-b444-151927715b1a\") " Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.116990 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.118619 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.122519 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-scripts" (OuterVolumeSpecName: "scripts") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.122576 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3a79a1-4358-46e0-b444-151927715b1a-kube-api-access-f5s9b" (OuterVolumeSpecName: "kube-api-access-f5s9b") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "kube-api-access-f5s9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.151745 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.190769 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.217111 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-config-data" (OuterVolumeSpecName: "config-data") pod "6d3a79a1-4358-46e0-b444-151927715b1a" (UID: "6d3a79a1-4358-46e0-b444-151927715b1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218203 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218232 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218248 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218263 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5s9b\" (UniqueName: \"kubernetes.io/projected/6d3a79a1-4358-46e0-b444-151927715b1a-kube-api-access-f5s9b\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218277 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218288 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d3a79a1-4358-46e0-b444-151927715b1a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.218299 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3a79a1-4358-46e0-b444-151927715b1a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.369115 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d3a79a1-4358-46e0-b444-151927715b1a","Type":"ContainerDied","Data":"eefe6dd1623541a090d1079ccf82bf7ed8cdc9b7f3f4855ff3e6ef65530bc16f"} Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.369173 4799 scope.go:117] "RemoveContainer" containerID="5103f9e42692b1721fda4a4f6ef3873e8c3ea190cb0c87779f34fb7f0513b524" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.369181 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.372365 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"575c8839-3cdb-4137-967a-3544c626113f","Type":"ContainerStarted","Data":"725435a5e7d8844cd87da5c5293ae898afa23e74c54662e295cdd73a9ea11418"} Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.403091 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.9957766289999999 podStartE2EDuration="12.403072077s" podCreationTimestamp="2026-03-19 20:24:38 +0000 UTC" firstStartedPulling="2026-03-19 20:24:39.233034312 +0000 UTC m=+1156.838987384" lastFinishedPulling="2026-03-19 20:24:49.64032974 +0000 UTC m=+1167.246282832" observedRunningTime="2026-03-19 20:24:50.398101801 +0000 UTC m=+1168.004054873" watchObservedRunningTime="2026-03-19 20:24:50.403072077 +0000 UTC m=+1168.009025149" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.405257 4799 scope.go:117] "RemoveContainer" containerID="f2249b8e7bc21392f4ac1d92d84f2c7ed7a7f86f5310883b8d63f4bef5cfb036" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.424913 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.434468 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 
20:24:50.440070 4799 scope.go:117] "RemoveContainer" containerID="6464cad4f621fcf55e5ebb42382111dea71d0e723843e30024d6bdd184f20043" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.449640 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:50 crc kubenswrapper[4799]: E0319 20:24:50.450003 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="proxy-httpd" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450015 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="proxy-httpd" Mar 19 20:24:50 crc kubenswrapper[4799]: E0319 20:24:50.450026 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-central-agent" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450032 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-central-agent" Mar 19 20:24:50 crc kubenswrapper[4799]: E0319 20:24:50.450055 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="sg-core" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450060 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="sg-core" Mar 19 20:24:50 crc kubenswrapper[4799]: E0319 20:24:50.450074 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-notification-agent" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450080 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-notification-agent" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450240 4799 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-central-agent" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450263 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="proxy-httpd" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450278 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="sg-core" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.450291 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" containerName="ceilometer-notification-agent" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.452612 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.454999 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.455259 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.457472 4799 scope.go:117] "RemoveContainer" containerID="8ca9c629404a820204009b08e62eb4e9eb40372475e797d31cb6065bfc3714d6" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.475234 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624744 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624840 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624878 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-run-httpd\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624895 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsmk\" (UniqueName: \"kubernetes.io/projected/77915c6c-9c63-4784-a290-7c25b984d08b-kube-api-access-7rsmk\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624916 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-log-httpd\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624966 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-config-data\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.624989 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-scripts\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726668 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-config-data\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726738 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-scripts\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726787 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726865 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726931 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-run-httpd\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726951 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rsmk\" (UniqueName: \"kubernetes.io/projected/77915c6c-9c63-4784-a290-7c25b984d08b-kube-api-access-7rsmk\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.726981 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-log-httpd\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.727577 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-log-httpd\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.727721 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-run-httpd\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.732715 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.733185 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-config-data\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " 
pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.736718 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.737501 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-scripts\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.750340 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rsmk\" (UniqueName: \"kubernetes.io/projected/77915c6c-9c63-4784-a290-7c25b984d08b-kube-api-access-7rsmk\") pod \"ceilometer-0\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.766977 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.826879 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2msv6"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.827981 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.843680 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2msv6"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.934912 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lccb6"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.936310 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.936661 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-operator-scripts\") pod \"nova-api-db-create-2msv6\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.936814 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v27n\" (UniqueName: \"kubernetes.io/projected/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-kube-api-access-7v27n\") pod \"nova-api-db-create-2msv6\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.951168 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e7c2-account-create-update-szzhc"] Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.953348 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:50 crc kubenswrapper[4799]: I0319 20:24:50.958682 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.012323 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lccb6"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.042888 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e7c2-account-create-update-szzhc"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.044365 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c920492a-2fc5-4531-9eff-538a52f5d3de-operator-scripts\") pod \"nova-cell0-db-create-lccb6\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.044431 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5q85\" (UniqueName: \"kubernetes.io/projected/c920492a-2fc5-4531-9eff-538a52f5d3de-kube-api-access-l5q85\") pod \"nova-cell0-db-create-lccb6\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.044481 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-operator-scripts\") pod \"nova-api-db-create-2msv6\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.044566 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v27n\" (UniqueName: 
\"kubernetes.io/projected/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-kube-api-access-7v27n\") pod \"nova-api-db-create-2msv6\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.045887 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-operator-scripts\") pod \"nova-api-db-create-2msv6\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.073076 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v27n\" (UniqueName: \"kubernetes.io/projected/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-kube-api-access-7v27n\") pod \"nova-api-db-create-2msv6\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.081426 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8skh8"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.082660 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.101671 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8skh8"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.135691 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3a79a1-4358-46e0-b444-151927715b1a" path="/var/lib/kubelet/pods/6d3a79a1-4358-46e0-b444-151927715b1a/volumes" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.145467 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgmb\" (UniqueName: \"kubernetes.io/projected/a750507b-f4c5-4327-ad80-5b20b8740bef-kube-api-access-gkgmb\") pod \"nova-api-e7c2-account-create-update-szzhc\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.145557 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c920492a-2fc5-4531-9eff-538a52f5d3de-operator-scripts\") pod \"nova-cell0-db-create-lccb6\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.145589 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5q85\" (UniqueName: \"kubernetes.io/projected/c920492a-2fc5-4531-9eff-538a52f5d3de-kube-api-access-l5q85\") pod \"nova-cell0-db-create-lccb6\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.145608 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a750507b-f4c5-4327-ad80-5b20b8740bef-operator-scripts\") pod 
\"nova-api-e7c2-account-create-update-szzhc\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.146687 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e217-account-create-update-ww4cp"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.146709 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c920492a-2fc5-4531-9eff-538a52f5d3de-operator-scripts\") pod \"nova-cell0-db-create-lccb6\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.148118 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.150309 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.155261 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e217-account-create-update-ww4cp"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.177358 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5q85\" (UniqueName: \"kubernetes.io/projected/c920492a-2fc5-4531-9eff-538a52f5d3de-kube-api-access-l5q85\") pod \"nova-cell0-db-create-lccb6\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.240604 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.246802 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a750507b-f4c5-4327-ad80-5b20b8740bef-operator-scripts\") pod \"nova-api-e7c2-account-create-update-szzhc\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.246870 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630a3885-5144-4f12-9488-b51346e29dee-operator-scripts\") pod \"nova-cell0-e217-account-create-update-ww4cp\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.246909 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmlg\" (UniqueName: \"kubernetes.io/projected/630a3885-5144-4f12-9488-b51346e29dee-kube-api-access-stmlg\") pod \"nova-cell0-e217-account-create-update-ww4cp\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.246959 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgmb\" (UniqueName: \"kubernetes.io/projected/a750507b-f4c5-4327-ad80-5b20b8740bef-kube-api-access-gkgmb\") pod \"nova-api-e7c2-account-create-update-szzhc\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.246998 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgsh\" 
(UniqueName: \"kubernetes.io/projected/819444ce-2f1f-4970-b22d-80cd3bf90a1d-kube-api-access-nqgsh\") pod \"nova-cell1-db-create-8skh8\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.247025 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/819444ce-2f1f-4970-b22d-80cd3bf90a1d-operator-scripts\") pod \"nova-cell1-db-create-8skh8\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.249600 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a750507b-f4c5-4327-ad80-5b20b8740bef-operator-scripts\") pod \"nova-api-e7c2-account-create-update-szzhc\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.268257 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgmb\" (UniqueName: \"kubernetes.io/projected/a750507b-f4c5-4327-ad80-5b20b8740bef-kube-api-access-gkgmb\") pod \"nova-api-e7c2-account-create-update-szzhc\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.277422 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.288806 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.303569 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.336025 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-de1e-account-create-update-t6tzl"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.337579 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.344372 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.349732 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630a3885-5144-4f12-9488-b51346e29dee-operator-scripts\") pod \"nova-cell0-e217-account-create-update-ww4cp\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.349795 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmlg\" (UniqueName: \"kubernetes.io/projected/630a3885-5144-4f12-9488-b51346e29dee-kube-api-access-stmlg\") pod \"nova-cell0-e217-account-create-update-ww4cp\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.349884 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqgsh\" (UniqueName: \"kubernetes.io/projected/819444ce-2f1f-4970-b22d-80cd3bf90a1d-kube-api-access-nqgsh\") pod \"nova-cell1-db-create-8skh8\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " 
pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.349929 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/819444ce-2f1f-4970-b22d-80cd3bf90a1d-operator-scripts\") pod \"nova-cell1-db-create-8skh8\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.350738 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/819444ce-2f1f-4970-b22d-80cd3bf90a1d-operator-scripts\") pod \"nova-cell1-db-create-8skh8\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.351214 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630a3885-5144-4f12-9488-b51346e29dee-operator-scripts\") pod \"nova-cell0-e217-account-create-update-ww4cp\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.365833 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-de1e-account-create-update-t6tzl"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.373850 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqgsh\" (UniqueName: \"kubernetes.io/projected/819444ce-2f1f-4970-b22d-80cd3bf90a1d-kube-api-access-nqgsh\") pod \"nova-cell1-db-create-8skh8\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.374514 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmlg\" (UniqueName: 
\"kubernetes.io/projected/630a3885-5144-4f12-9488-b51346e29dee-kube-api-access-stmlg\") pod \"nova-cell0-e217-account-create-update-ww4cp\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.388349 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerStarted","Data":"687232ec8bca99aa08b91e092a54848a68f8578f45b462923c1ae71d64d21c8e"} Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.441987 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.451886 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bc4\" (UniqueName: \"kubernetes.io/projected/19f679b2-054e-4fd3-9b93-276ffd1a45b6-kube-api-access-r2bc4\") pod \"nova-cell1-de1e-account-create-update-t6tzl\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.452025 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f679b2-054e-4fd3-9b93-276ffd1a45b6-operator-scripts\") pod \"nova-cell1-de1e-account-create-update-t6tzl\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.485108 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.553375 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bc4\" (UniqueName: \"kubernetes.io/projected/19f679b2-054e-4fd3-9b93-276ffd1a45b6-kube-api-access-r2bc4\") pod \"nova-cell1-de1e-account-create-update-t6tzl\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.553577 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f679b2-054e-4fd3-9b93-276ffd1a45b6-operator-scripts\") pod \"nova-cell1-de1e-account-create-update-t6tzl\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.555543 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f679b2-054e-4fd3-9b93-276ffd1a45b6-operator-scripts\") pod \"nova-cell1-de1e-account-create-update-t6tzl\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.580350 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bc4\" (UniqueName: \"kubernetes.io/projected/19f679b2-054e-4fd3-9b93-276ffd1a45b6-kube-api-access-r2bc4\") pod \"nova-cell1-de1e-account-create-update-t6tzl\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.671920 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.721919 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2msv6"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.852359 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lccb6"] Mar 19 20:24:51 crc kubenswrapper[4799]: I0319 20:24:51.870194 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e7c2-account-create-update-szzhc"] Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.008237 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8skh8"] Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.025407 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e217-account-create-update-ww4cp"] Mar 19 20:24:52 crc kubenswrapper[4799]: W0319 20:24:52.033577 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod819444ce_2f1f_4970_b22d_80cd3bf90a1d.slice/crio-82870a1ca7447c7245b6bab5e23b44bf1d60417482028dfc737150e89f050cd0 WatchSource:0}: Error finding container 82870a1ca7447c7245b6bab5e23b44bf1d60417482028dfc737150e89f050cd0: Status 404 returned error can't find the container with id 82870a1ca7447c7245b6bab5e23b44bf1d60417482028dfc737150e89f050cd0 Mar 19 20:24:52 crc kubenswrapper[4799]: W0319 20:24:52.033944 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630a3885_5144_4f12_9488_b51346e29dee.slice/crio-2da5be8d00470d02de914910820f68a646b8419ad876d6d3e348f36fb761db31 WatchSource:0}: Error finding container 2da5be8d00470d02de914910820f68a646b8419ad876d6d3e348f36fb761db31: Status 404 returned error can't find the container with id 
2da5be8d00470d02de914910820f68a646b8419ad876d6d3e348f36fb761db31 Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.216029 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-de1e-account-create-update-t6tzl"] Mar 19 20:24:52 crc kubenswrapper[4799]: W0319 20:24:52.258631 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19f679b2_054e_4fd3_9b93_276ffd1a45b6.slice/crio-e8b7316a785087e74a5150c8755f5c1402dcb46a9e25b98fcb2fc4f4fba4200e WatchSource:0}: Error finding container e8b7316a785087e74a5150c8755f5c1402dcb46a9e25b98fcb2fc4f4fba4200e: Status 404 returned error can't find the container with id e8b7316a785087e74a5150c8755f5c1402dcb46a9e25b98fcb2fc4f4fba4200e Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.366376 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.464041 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" event={"ID":"630a3885-5144-4f12-9488-b51346e29dee","Type":"ContainerStarted","Data":"2da5be8d00470d02de914910820f68a646b8419ad876d6d3e348f36fb761db31"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.470984 4799 generic.go:334] "Generic (PLEG): container finished" podID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerID="e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42" exitCode=0 Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.471060 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9f48b3d-c638-4337-b3f7-c599bcf7ef72","Type":"ContainerDied","Data":"e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.471088 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"c9f48b3d-c638-4337-b3f7-c599bcf7ef72","Type":"ContainerDied","Data":"ceda4e3b87a68da048e50594e55ccf16030a4f96471fe0e9b6f01caa2486b91d"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.471112 4799 scope.go:117] "RemoveContainer" containerID="e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.471271 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478230 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-internal-tls-certs\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478369 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-httpd-run\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478412 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-logs\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478431 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-scripts\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 
20:24:52.478485 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwxh5\" (UniqueName: \"kubernetes.io/projected/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-kube-api-access-fwxh5\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478509 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-combined-ca-bundle\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478566 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-config-data\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.478792 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\" (UID: \"c9f48b3d-c638-4337-b3f7-c599bcf7ef72\") " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.480786 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.482828 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-logs" (OuterVolumeSpecName: "logs") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.485315 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" event={"ID":"19f679b2-054e-4fd3-9b93-276ffd1a45b6","Type":"ContainerStarted","Data":"e8b7316a785087e74a5150c8755f5c1402dcb46a9e25b98fcb2fc4f4fba4200e"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.491763 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lccb6" event={"ID":"c920492a-2fc5-4531-9eff-538a52f5d3de","Type":"ContainerStarted","Data":"26d980a05014aaec753f74f5670f922f956324227822f53c0a2b4d0d108325d8"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.491819 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lccb6" event={"ID":"c920492a-2fc5-4531-9eff-538a52f5d3de","Type":"ContainerStarted","Data":"6a99817ec82af7f97f6d2fdd940059d7edf81635f07ec369fb51a7fa8fc66c94"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.501517 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8skh8" event={"ID":"819444ce-2f1f-4970-b22d-80cd3bf90a1d","Type":"ContainerStarted","Data":"82870a1ca7447c7245b6bab5e23b44bf1d60417482028dfc737150e89f050cd0"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.502604 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e7c2-account-create-update-szzhc" 
event={"ID":"a750507b-f4c5-4327-ad80-5b20b8740bef","Type":"ContainerStarted","Data":"acec738dab35ffd64311d9be78abd168562593e47b895394daa37ef84ac4b1aa"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.504833 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2msv6" event={"ID":"bcb48c3f-ff08-41c9-b3b9-6d974dc85797","Type":"ContainerStarted","Data":"2403c6a66e3746f55049f0e96b136599fa3c88bdaf68f346b55229bc68180da3"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.504855 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2msv6" event={"ID":"bcb48c3f-ff08-41c9-b3b9-6d974dc85797","Type":"ContainerStarted","Data":"2c8bfc141034b435dfec96271ac548cb2bacbccbc23f59e31e784b0178627ea1"} Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.520092 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-lccb6" podStartSLOduration=2.5200707639999997 podStartE2EDuration="2.520070764s" podCreationTimestamp="2026-03-19 20:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:52.509220777 +0000 UTC m=+1170.115173849" watchObservedRunningTime="2026-03-19 20:24:52.520070764 +0000 UTC m=+1170.126023836" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.522761 4799 scope.go:117] "RemoveContainer" containerID="268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.525973 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.528432 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-scripts" (OuterVolumeSpecName: "scripts") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.541999 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-kube-api-access-fwxh5" (OuterVolumeSpecName: "kube-api-access-fwxh5") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "kube-api-access-fwxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.580610 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.581334 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.581527 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.581588 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 
20:24:52.581651 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwxh5\" (UniqueName: \"kubernetes.io/projected/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-kube-api-access-fwxh5\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.661242 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.683579 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.690112 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.773697 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-config-data" (OuterVolumeSpecName: "config-data") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.786307 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.786338 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.789223 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9f48b3d-c638-4337-b3f7-c599bcf7ef72" (UID: "c9f48b3d-c638-4337-b3f7-c599bcf7ef72"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.887819 4799 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9f48b3d-c638-4337-b3f7-c599bcf7ef72-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.892030 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:52 crc kubenswrapper[4799]: I0319 20:24:52.893553 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75d564c56c-vzn6z" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.077883 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.187676 4799 scope.go:117] "RemoveContainer" containerID="e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42" 
Mar 19 20:24:53 crc kubenswrapper[4799]: E0319 20:24:53.191230 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42\": container with ID starting with e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42 not found: ID does not exist" containerID="e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.191267 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42"} err="failed to get container status \"e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42\": rpc error: code = NotFound desc = could not find container \"e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42\": container with ID starting with e20342e337c6b062682d6f21bb72e3c70e507a0dda50c18c43a6e796ce4eae42 not found: ID does not exist" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.191292 4799 scope.go:117] "RemoveContainer" containerID="268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f" Mar 19 20:24:53 crc kubenswrapper[4799]: E0319 20:24:53.194806 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f\": container with ID starting with 268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f not found: ID does not exist" containerID="268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.194861 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f"} err="failed to get container status 
\"268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f\": rpc error: code = NotFound desc = could not find container \"268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f\": container with ID starting with 268fe46264fd6d8d297ef464f2c52b27efe3413533d964fcacaeda32797bb94f not found: ID does not exist" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.227566 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.252960 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.263685 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:53 crc kubenswrapper[4799]: E0319 20:24:53.264135 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-httpd" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.264155 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-httpd" Mar 19 20:24:53 crc kubenswrapper[4799]: E0319 20:24:53.264181 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-log" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.264188 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-log" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.264418 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-httpd" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.264438 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" containerName="glance-log" Mar 
19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.267107 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.269425 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.271265 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.273416 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307136 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4455494-ef4e-4f95-87d6-cd495059bb9a-logs\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307170 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhfc\" (UniqueName: \"kubernetes.io/projected/a4455494-ef4e-4f95-87d6-cd495059bb9a-kube-api-access-5zhfc\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307198 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307237 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307337 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307352 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307453 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.307486 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4455494-ef4e-4f95-87d6-cd495059bb9a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.408876 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4455494-ef4e-4f95-87d6-cd495059bb9a-logs\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.408914 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhfc\" (UniqueName: \"kubernetes.io/projected/a4455494-ef4e-4f95-87d6-cd495059bb9a-kube-api-access-5zhfc\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.408935 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.408972 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.409029 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.409051 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.409118 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.409144 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4455494-ef4e-4f95-87d6-cd495059bb9a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.409612 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4455494-ef4e-4f95-87d6-cd495059bb9a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.409823 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4455494-ef4e-4f95-87d6-cd495059bb9a-logs\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.410617 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.416171 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.416621 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.417144 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.417992 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4455494-ef4e-4f95-87d6-cd495059bb9a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.426802 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhfc\" (UniqueName: \"kubernetes.io/projected/a4455494-ef4e-4f95-87d6-cd495059bb9a-kube-api-access-5zhfc\") pod \"glance-default-internal-api-0\" (UID: 
\"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.437111 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4455494-ef4e-4f95-87d6-cd495059bb9a\") " pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.520527 4799 generic.go:334] "Generic (PLEG): container finished" podID="819444ce-2f1f-4970-b22d-80cd3bf90a1d" containerID="271bb4327b0578e5b0ddc914d0d2bc1565e6e54463224880ac06d414ef06cb9a" exitCode=0 Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.520610 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8skh8" event={"ID":"819444ce-2f1f-4970-b22d-80cd3bf90a1d","Type":"ContainerDied","Data":"271bb4327b0578e5b0ddc914d0d2bc1565e6e54463224880ac06d414ef06cb9a"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.522545 4799 generic.go:334] "Generic (PLEG): container finished" podID="a750507b-f4c5-4327-ad80-5b20b8740bef" containerID="f7a314e611b6d56bace35ef21892d25f624e26145034811bfe59eeea49762ccf" exitCode=0 Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.522661 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e7c2-account-create-update-szzhc" event={"ID":"a750507b-f4c5-4327-ad80-5b20b8740bef","Type":"ContainerDied","Data":"f7a314e611b6d56bace35ef21892d25f624e26145034811bfe59eeea49762ccf"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.524789 4799 generic.go:334] "Generic (PLEG): container finished" podID="bcb48c3f-ff08-41c9-b3b9-6d974dc85797" containerID="2403c6a66e3746f55049f0e96b136599fa3c88bdaf68f346b55229bc68180da3" exitCode=0 Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.524825 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-2msv6" event={"ID":"bcb48c3f-ff08-41c9-b3b9-6d974dc85797","Type":"ContainerDied","Data":"2403c6a66e3746f55049f0e96b136599fa3c88bdaf68f346b55229bc68180da3"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.530376 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerStarted","Data":"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.530452 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerStarted","Data":"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.539289 4799 generic.go:334] "Generic (PLEG): container finished" podID="630a3885-5144-4f12-9488-b51346e29dee" containerID="da2eda38a0f2530838cfdd31a9c72be9d2a8499a055ed7272cb62f0aad9943a3" exitCode=0 Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.539457 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" event={"ID":"630a3885-5144-4f12-9488-b51346e29dee","Type":"ContainerDied","Data":"da2eda38a0f2530838cfdd31a9c72be9d2a8499a055ed7272cb62f0aad9943a3"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.543667 4799 generic.go:334] "Generic (PLEG): container finished" podID="19f679b2-054e-4fd3-9b93-276ffd1a45b6" containerID="d0d006f7cd56602aa308c1ec73c132a5490886e9b3ab340eccc945dbafcbe5ba" exitCode=0 Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.543725 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" event={"ID":"19f679b2-054e-4fd3-9b93-276ffd1a45b6","Type":"ContainerDied","Data":"d0d006f7cd56602aa308c1ec73c132a5490886e9b3ab340eccc945dbafcbe5ba"} Mar 19 20:24:53 crc kubenswrapper[4799]: 
I0319 20:24:53.550725 4799 generic.go:334] "Generic (PLEG): container finished" podID="c920492a-2fc5-4531-9eff-538a52f5d3de" containerID="26d980a05014aaec753f74f5670f922f956324227822f53c0a2b4d0d108325d8" exitCode=0 Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.551745 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lccb6" event={"ID":"c920492a-2fc5-4531-9eff-538a52f5d3de","Type":"ContainerDied","Data":"26d980a05014aaec753f74f5670f922f956324227822f53c0a2b4d0d108325d8"} Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.598160 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 20:24:53 crc kubenswrapper[4799]: I0319 20:24:53.955612 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.032963 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v27n\" (UniqueName: \"kubernetes.io/projected/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-kube-api-access-7v27n\") pod \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.033201 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-operator-scripts\") pod \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\" (UID: \"bcb48c3f-ff08-41c9-b3b9-6d974dc85797\") " Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.034021 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcb48c3f-ff08-41c9-b3b9-6d974dc85797" (UID: "bcb48c3f-ff08-41c9-b3b9-6d974dc85797"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.038044 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-kube-api-access-7v27n" (OuterVolumeSpecName: "kube-api-access-7v27n") pod "bcb48c3f-ff08-41c9-b3b9-6d974dc85797" (UID: "bcb48c3f-ff08-41c9-b3b9-6d974dc85797"). InnerVolumeSpecName "kube-api-access-7v27n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.135631 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.135663 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v27n\" (UniqueName: \"kubernetes.io/projected/bcb48c3f-ff08-41c9-b3b9-6d974dc85797-kube-api-access-7v27n\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.265002 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 20:24:54 crc kubenswrapper[4799]: W0319 20:24:54.276608 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4455494_ef4e_4f95_87d6_cd495059bb9a.slice/crio-4948092b2d109121fbd651c3d94ad742ad2bb1c612a5be507a297d3a4ef73fef WatchSource:0}: Error finding container 4948092b2d109121fbd651c3d94ad742ad2bb1c612a5be507a297d3a4ef73fef: Status 404 returned error can't find the container with id 4948092b2d109121fbd651c3d94ad742ad2bb1c612a5be507a297d3a4ef73fef Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.569117 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2msv6" 
event={"ID":"bcb48c3f-ff08-41c9-b3b9-6d974dc85797","Type":"ContainerDied","Data":"2c8bfc141034b435dfec96271ac548cb2bacbccbc23f59e31e784b0178627ea1"} Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.569157 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8bfc141034b435dfec96271ac548cb2bacbccbc23f59e31e784b0178627ea1" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.569191 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2msv6" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.570895 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerStarted","Data":"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae"} Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.572577 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a4455494-ef4e-4f95-87d6-cd495059bb9a","Type":"ContainerStarted","Data":"4948092b2d109121fbd651c3d94ad742ad2bb1c612a5be507a297d3a4ef73fef"} Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.947107 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.979825 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a750507b-f4c5-4327-ad80-5b20b8740bef-operator-scripts\") pod \"a750507b-f4c5-4327-ad80-5b20b8740bef\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.979874 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkgmb\" (UniqueName: \"kubernetes.io/projected/a750507b-f4c5-4327-ad80-5b20b8740bef-kube-api-access-gkgmb\") pod \"a750507b-f4c5-4327-ad80-5b20b8740bef\" (UID: \"a750507b-f4c5-4327-ad80-5b20b8740bef\") " Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.980338 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a750507b-f4c5-4327-ad80-5b20b8740bef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a750507b-f4c5-4327-ad80-5b20b8740bef" (UID: "a750507b-f4c5-4327-ad80-5b20b8740bef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.992511 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a750507b-f4c5-4327-ad80-5b20b8740bef-kube-api-access-gkgmb" (OuterVolumeSpecName: "kube-api-access-gkgmb") pod "a750507b-f4c5-4327-ad80-5b20b8740bef" (UID: "a750507b-f4c5-4327-ad80-5b20b8740bef"). InnerVolumeSpecName "kube-api-access-gkgmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.992620 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8869c89f8-jvpbt" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 20:24:54 crc kubenswrapper[4799]: I0319 20:24:54.992724 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.088712 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a750507b-f4c5-4327-ad80-5b20b8740bef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.088742 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkgmb\" (UniqueName: \"kubernetes.io/projected/a750507b-f4c5-4327-ad80-5b20b8740bef-kube-api-access-gkgmb\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.133159 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f48b3d-c638-4337-b3f7-c599bcf7ef72" path="/var/lib/kubelet/pods/c9f48b3d-c638-4337-b3f7-c599bcf7ef72/volumes" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.149487 4799 scope.go:117] "RemoveContainer" containerID="3b17faeda0064e60c2e0c03ef3629515e8156ae2db55a7bd5ffefc54b881fffb" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.218627 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.225679 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.239667 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.243272 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295069 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c920492a-2fc5-4531-9eff-538a52f5d3de-operator-scripts\") pod \"c920492a-2fc5-4531-9eff-538a52f5d3de\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295361 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stmlg\" (UniqueName: \"kubernetes.io/projected/630a3885-5144-4f12-9488-b51346e29dee-kube-api-access-stmlg\") pod \"630a3885-5144-4f12-9488-b51346e29dee\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295408 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/819444ce-2f1f-4970-b22d-80cd3bf90a1d-operator-scripts\") pod \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295443 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f679b2-054e-4fd3-9b93-276ffd1a45b6-operator-scripts\") pod \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295480 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5q85\" (UniqueName: \"kubernetes.io/projected/c920492a-2fc5-4531-9eff-538a52f5d3de-kube-api-access-l5q85\") pod \"c920492a-2fc5-4531-9eff-538a52f5d3de\" (UID: \"c920492a-2fc5-4531-9eff-538a52f5d3de\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295499 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqgsh\" (UniqueName: \"kubernetes.io/projected/819444ce-2f1f-4970-b22d-80cd3bf90a1d-kube-api-access-nqgsh\") pod \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\" (UID: \"819444ce-2f1f-4970-b22d-80cd3bf90a1d\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295537 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630a3885-5144-4f12-9488-b51346e29dee-operator-scripts\") pod \"630a3885-5144-4f12-9488-b51346e29dee\" (UID: \"630a3885-5144-4f12-9488-b51346e29dee\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.295562 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2bc4\" (UniqueName: \"kubernetes.io/projected/19f679b2-054e-4fd3-9b93-276ffd1a45b6-kube-api-access-r2bc4\") pod \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\" (UID: \"19f679b2-054e-4fd3-9b93-276ffd1a45b6\") " Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.296627 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819444ce-2f1f-4970-b22d-80cd3bf90a1d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "819444ce-2f1f-4970-b22d-80cd3bf90a1d" (UID: "819444ce-2f1f-4970-b22d-80cd3bf90a1d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.297003 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c920492a-2fc5-4531-9eff-538a52f5d3de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c920492a-2fc5-4531-9eff-538a52f5d3de" (UID: "c920492a-2fc5-4531-9eff-538a52f5d3de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.297574 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630a3885-5144-4f12-9488-b51346e29dee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "630a3885-5144-4f12-9488-b51346e29dee" (UID: "630a3885-5144-4f12-9488-b51346e29dee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.297911 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f679b2-054e-4fd3-9b93-276ffd1a45b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19f679b2-054e-4fd3-9b93-276ffd1a45b6" (UID: "19f679b2-054e-4fd3-9b93-276ffd1a45b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.300214 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630a3885-5144-4f12-9488-b51346e29dee-kube-api-access-stmlg" (OuterVolumeSpecName: "kube-api-access-stmlg") pod "630a3885-5144-4f12-9488-b51346e29dee" (UID: "630a3885-5144-4f12-9488-b51346e29dee"). InnerVolumeSpecName "kube-api-access-stmlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.300256 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c920492a-2fc5-4531-9eff-538a52f5d3de-kube-api-access-l5q85" (OuterVolumeSpecName: "kube-api-access-l5q85") pod "c920492a-2fc5-4531-9eff-538a52f5d3de" (UID: "c920492a-2fc5-4531-9eff-538a52f5d3de"). InnerVolumeSpecName "kube-api-access-l5q85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.301519 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f679b2-054e-4fd3-9b93-276ffd1a45b6-kube-api-access-r2bc4" (OuterVolumeSpecName: "kube-api-access-r2bc4") pod "19f679b2-054e-4fd3-9b93-276ffd1a45b6" (UID: "19f679b2-054e-4fd3-9b93-276ffd1a45b6"). InnerVolumeSpecName "kube-api-access-r2bc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.305278 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819444ce-2f1f-4970-b22d-80cd3bf90a1d-kube-api-access-nqgsh" (OuterVolumeSpecName: "kube-api-access-nqgsh") pod "819444ce-2f1f-4970-b22d-80cd3bf90a1d" (UID: "819444ce-2f1f-4970-b22d-80cd3bf90a1d"). InnerVolumeSpecName "kube-api-access-nqgsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398275 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c920492a-2fc5-4531-9eff-538a52f5d3de-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398313 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stmlg\" (UniqueName: \"kubernetes.io/projected/630a3885-5144-4f12-9488-b51346e29dee-kube-api-access-stmlg\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398329 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/819444ce-2f1f-4970-b22d-80cd3bf90a1d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398341 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19f679b2-054e-4fd3-9b93-276ffd1a45b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398355 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5q85\" (UniqueName: \"kubernetes.io/projected/c920492a-2fc5-4531-9eff-538a52f5d3de-kube-api-access-l5q85\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398367 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqgsh\" (UniqueName: \"kubernetes.io/projected/819444ce-2f1f-4970-b22d-80cd3bf90a1d-kube-api-access-nqgsh\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398398 4799 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/630a3885-5144-4f12-9488-b51346e29dee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 
20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.398410 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2bc4\" (UniqueName: \"kubernetes.io/projected/19f679b2-054e-4fd3-9b93-276ffd1a45b6-kube-api-access-r2bc4\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.588544 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a4455494-ef4e-4f95-87d6-cd495059bb9a","Type":"ContainerStarted","Data":"ab068aa91132fe249f753d6c17eabded441a8b43dfc915b5ef3edb093b63db8b"} Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.589703 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lccb6" event={"ID":"c920492a-2fc5-4531-9eff-538a52f5d3de","Type":"ContainerDied","Data":"6a99817ec82af7f97f6d2fdd940059d7edf81635f07ec369fb51a7fa8fc66c94"} Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.589726 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a99817ec82af7f97f6d2fdd940059d7edf81635f07ec369fb51a7fa8fc66c94" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.589780 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lccb6" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.608491 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" event={"ID":"630a3885-5144-4f12-9488-b51346e29dee","Type":"ContainerDied","Data":"2da5be8d00470d02de914910820f68a646b8419ad876d6d3e348f36fb761db31"} Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.608527 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da5be8d00470d02de914910820f68a646b8419ad876d6d3e348f36fb761db31" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.608577 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e217-account-create-update-ww4cp" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.613102 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" event={"ID":"19f679b2-054e-4fd3-9b93-276ffd1a45b6","Type":"ContainerDied","Data":"e8b7316a785087e74a5150c8755f5c1402dcb46a9e25b98fcb2fc4f4fba4200e"} Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.613158 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b7316a785087e74a5150c8755f5c1402dcb46a9e25b98fcb2fc4f4fba4200e" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.613216 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-de1e-account-create-update-t6tzl" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.628952 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8skh8" event={"ID":"819444ce-2f1f-4970-b22d-80cd3bf90a1d","Type":"ContainerDied","Data":"82870a1ca7447c7245b6bab5e23b44bf1d60417482028dfc737150e89f050cd0"} Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.628986 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82870a1ca7447c7245b6bab5e23b44bf1d60417482028dfc737150e89f050cd0" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.629044 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8skh8" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.642069 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e7c2-account-create-update-szzhc" event={"ID":"a750507b-f4c5-4327-ad80-5b20b8740bef","Type":"ContainerDied","Data":"acec738dab35ffd64311d9be78abd168562593e47b895394daa37ef84ac4b1aa"} Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.642111 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acec738dab35ffd64311d9be78abd168562593e47b895394daa37ef84ac4b1aa" Mar 19 20:24:55 crc kubenswrapper[4799]: I0319 20:24:55.642167 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e7c2-account-create-update-szzhc" Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.035411 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77768f5c85-6lgxw" Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.138475 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-699d85bbdd-vfn2t"] Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.138692 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-699d85bbdd-vfn2t" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-api" containerID="cri-o://9a11f9f5a9fa98125b8a45799e58a65178cf6d659d9caac7412dd20c13cc0a97" gracePeriod=30 Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.139055 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-699d85bbdd-vfn2t" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-httpd" containerID="cri-o://20c0185206eb3196343f35f5c5920f7734a7afa5c139e36eb4292590b3552508" gracePeriod=30 Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.676294 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerID="20c0185206eb3196343f35f5c5920f7734a7afa5c139e36eb4292590b3552508" exitCode=0 Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.676685 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699d85bbdd-vfn2t" event={"ID":"794e2fbb-5a64-4bc9-b25a-9041256b23ea","Type":"ContainerDied","Data":"20c0185206eb3196343f35f5c5920f7734a7afa5c139e36eb4292590b3552508"} Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.678494 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a4455494-ef4e-4f95-87d6-cd495059bb9a","Type":"ContainerStarted","Data":"c59ebb956d6877627e0b8cfb25e61508bf16e77e8b2f1edc23fe5d45494906d3"} Mar 19 20:24:56 crc kubenswrapper[4799]: I0319 20:24:56.703865 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.703847261 podStartE2EDuration="3.703847261s" podCreationTimestamp="2026-03-19 20:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:24:56.700713015 +0000 UTC m=+1174.306666087" watchObservedRunningTime="2026-03-19 20:24:56.703847261 +0000 UTC m=+1174.309800323" Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.536658 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.537612 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-httpd" containerID="cri-o://75acd7be3a759b0800e888157f520500d7460087d231ed76271305b389800abf" gracePeriod=30 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.537809 4799 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-log" containerID="cri-o://71c0fca97b439f69dcf9e6636cbdf117b255ba609995991f274e83aeb2c785ef" gracePeriod=30 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.687441 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerStarted","Data":"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e"} Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.687634 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-central-agent" containerID="cri-o://5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" gracePeriod=30 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.687719 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.688192 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="proxy-httpd" containerID="cri-o://4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" gracePeriod=30 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.688294 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="sg-core" containerID="cri-o://ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" gracePeriod=30 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.688348 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-notification-agent" 
containerID="cri-o://29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" gracePeriod=30 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.695428 4799 generic.go:334] "Generic (PLEG): container finished" podID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerID="71c0fca97b439f69dcf9e6636cbdf117b255ba609995991f274e83aeb2c785ef" exitCode=143 Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.695471 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46527d99-7ba4-4e4e-baf3-77be33ab0460","Type":"ContainerDied","Data":"71c0fca97b439f69dcf9e6636cbdf117b255ba609995991f274e83aeb2c785ef"} Mar 19 20:24:57 crc kubenswrapper[4799]: I0319 20:24:57.721186 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.522687885 podStartE2EDuration="7.721164842s" podCreationTimestamp="2026-03-19 20:24:50 +0000 UTC" firstStartedPulling="2026-03-19 20:24:51.326146531 +0000 UTC m=+1168.932099603" lastFinishedPulling="2026-03-19 20:24:56.524623488 +0000 UTC m=+1174.130576560" observedRunningTime="2026-03-19 20:24:57.713279867 +0000 UTC m=+1175.319232949" watchObservedRunningTime="2026-03-19 20:24:57.721164842 +0000 UTC m=+1175.327117914" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.400253 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.502833 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-log-httpd\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.502914 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-run-httpd\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.502954 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-config-data\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.503007 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-combined-ca-bundle\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.503043 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rsmk\" (UniqueName: \"kubernetes.io/projected/77915c6c-9c63-4784-a290-7c25b984d08b-kube-api-access-7rsmk\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.503125 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-sg-core-conf-yaml\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.503195 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-scripts\") pod \"77915c6c-9c63-4784-a290-7c25b984d08b\" (UID: \"77915c6c-9c63-4784-a290-7c25b984d08b\") " Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.511220 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-scripts" (OuterVolumeSpecName: "scripts") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.511533 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.511720 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77915c6c-9c63-4784-a290-7c25b984d08b-kube-api-access-7rsmk" (OuterVolumeSpecName: "kube-api-access-7rsmk") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "kube-api-access-7rsmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.512218 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.552233 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.623802 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.623837 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77915c6c-9c63-4784-a290-7c25b984d08b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.623849 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rsmk\" (UniqueName: \"kubernetes.io/projected/77915c6c-9c63-4784-a290-7c25b984d08b-kube-api-access-7rsmk\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.623862 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" 
Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.623873 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.627895 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.695575 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-config-data" (OuterVolumeSpecName: "config-data") pod "77915c6c-9c63-4784-a290-7c25b984d08b" (UID: "77915c6c-9c63-4784-a290-7c25b984d08b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.744283 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.744317 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77915c6c-9c63-4784-a290-7c25b984d08b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780728 4799 generic.go:334] "Generic (PLEG): container finished" podID="77915c6c-9c63-4784-a290-7c25b984d08b" containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" exitCode=0 Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780759 4799 generic.go:334] "Generic (PLEG): container finished" podID="77915c6c-9c63-4784-a290-7c25b984d08b" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" exitCode=2 Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780773 4799 generic.go:334] "Generic (PLEG): container finished" podID="77915c6c-9c63-4784-a290-7c25b984d08b" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" exitCode=0 Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780780 4799 generic.go:334] "Generic (PLEG): container finished" podID="77915c6c-9c63-4784-a290-7c25b984d08b" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" exitCode=0 Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780797 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerDied","Data":"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e"} Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780823 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerDied","Data":"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae"} Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780832 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerDied","Data":"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6"} Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780841 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerDied","Data":"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f"} Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780849 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77915c6c-9c63-4784-a290-7c25b984d08b","Type":"ContainerDied","Data":"687232ec8bca99aa08b91e092a54848a68f8578f45b462923c1ae71d64d21c8e"} Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780864 4799 scope.go:117] "RemoveContainer" containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.780984 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.818844 4799 scope.go:117] "RemoveContainer" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.826324 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.834319 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.839446 4799 scope.go:117] "RemoveContainer" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850312 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850719 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630a3885-5144-4f12-9488-b51346e29dee" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850741 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="630a3885-5144-4f12-9488-b51346e29dee" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850754 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a750507b-f4c5-4327-ad80-5b20b8740bef" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850764 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a750507b-f4c5-4327-ad80-5b20b8740bef" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850782 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb48c3f-ff08-41c9-b3b9-6d974dc85797" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850793 4799 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb48c3f-ff08-41c9-b3b9-6d974dc85797" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850803 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f679b2-054e-4fd3-9b93-276ffd1a45b6" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850810 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f679b2-054e-4fd3-9b93-276ffd1a45b6" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850830 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="proxy-httpd" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850840 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="proxy-httpd" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850859 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-central-agent" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850866 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-central-agent" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850886 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="sg-core" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850894 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="sg-core" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850906 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c920492a-2fc5-4531-9eff-538a52f5d3de" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: 
I0319 20:24:58.850914 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c920492a-2fc5-4531-9eff-538a52f5d3de" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850931 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-notification-agent" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850938 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-notification-agent" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.850948 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819444ce-2f1f-4970-b22d-80cd3bf90a1d" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.850955 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="819444ce-2f1f-4970-b22d-80cd3bf90a1d" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851164 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb48c3f-ff08-41c9-b3b9-6d974dc85797" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851178 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="630a3885-5144-4f12-9488-b51346e29dee" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851190 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c920492a-2fc5-4531-9eff-538a52f5d3de" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851205 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="proxy-httpd" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851214 4799 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="19f679b2-054e-4fd3-9b93-276ffd1a45b6" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851228 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a750507b-f4c5-4327-ad80-5b20b8740bef" containerName="mariadb-account-create-update" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851241 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="819444ce-2f1f-4970-b22d-80cd3bf90a1d" containerName="mariadb-database-create" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851251 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-notification-agent" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851263 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="sg-core" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.851276 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" containerName="ceilometer-central-agent" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.854494 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.856667 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.856691 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.865293 4799 scope.go:117] "RemoveContainer" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.872333 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.899002 4799 scope.go:117] "RemoveContainer" containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.902741 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": container with ID starting with 4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e not found: ID does not exist" containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.902779 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e"} err="failed to get container status \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": rpc error: code = NotFound desc = could not find container \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": container with ID starting with 4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 
20:24:58.902804 4799 scope.go:117] "RemoveContainer" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.903146 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": container with ID starting with ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae not found: ID does not exist" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.903170 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae"} err="failed to get container status \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": rpc error: code = NotFound desc = could not find container \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": container with ID starting with ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.903185 4799 scope.go:117] "RemoveContainer" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.903445 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": container with ID starting with 29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6 not found: ID does not exist" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.903465 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6"} err="failed to get container status \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": rpc error: code = NotFound desc = could not find container \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": container with ID starting with 29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6 not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.903476 4799 scope.go:117] "RemoveContainer" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" Mar 19 20:24:58 crc kubenswrapper[4799]: E0319 20:24:58.903809 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": container with ID starting with 5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f not found: ID does not exist" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.903830 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f"} err="failed to get container status \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": rpc error: code = NotFound desc = could not find container \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": container with ID starting with 5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.903852 4799 scope.go:117] "RemoveContainer" containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904088 4799 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e"} err="failed to get container status \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": rpc error: code = NotFound desc = could not find container \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": container with ID starting with 4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904109 4799 scope.go:117] "RemoveContainer" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904279 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae"} err="failed to get container status \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": rpc error: code = NotFound desc = could not find container \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": container with ID starting with ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904299 4799 scope.go:117] "RemoveContainer" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904557 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6"} err="failed to get container status \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": rpc error: code = NotFound desc = could not find container \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": container with ID starting with 29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6 not 
found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904600 4799 scope.go:117] "RemoveContainer" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904798 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f"} err="failed to get container status \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": rpc error: code = NotFound desc = could not find container \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": container with ID starting with 5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.904898 4799 scope.go:117] "RemoveContainer" containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905157 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e"} err="failed to get container status \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": rpc error: code = NotFound desc = could not find container \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": container with ID starting with 4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905179 4799 scope.go:117] "RemoveContainer" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905329 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae"} err="failed to get 
container status \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": rpc error: code = NotFound desc = could not find container \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": container with ID starting with ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905504 4799 scope.go:117] "RemoveContainer" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905716 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6"} err="failed to get container status \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": rpc error: code = NotFound desc = could not find container \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": container with ID starting with 29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6 not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905739 4799 scope.go:117] "RemoveContainer" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905936 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f"} err="failed to get container status \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": rpc error: code = NotFound desc = could not find container \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": container with ID starting with 5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.905958 4799 scope.go:117] "RemoveContainer" 
containerID="4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906137 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e"} err="failed to get container status \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": rpc error: code = NotFound desc = could not find container \"4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e\": container with ID starting with 4a3d7cf6c68112e6f49fd5ef6f63e4c640cfd3b501121237a96b487f9d2afc0e not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906154 4799 scope.go:117] "RemoveContainer" containerID="ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906480 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae"} err="failed to get container status \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": rpc error: code = NotFound desc = could not find container \"ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae\": container with ID starting with ad67fe21c8542892b0006d0a0f64c7799d79862404b5d210b1327dda2db901ae not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906521 4799 scope.go:117] "RemoveContainer" containerID="29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906754 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6"} err="failed to get container status \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": rpc error: code = NotFound desc = could 
not find container \"29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6\": container with ID starting with 29a14c5bc831e21116bc31addb3b54c6159b9e258cdcd8126937b2af176851b6 not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906775 4799 scope.go:117] "RemoveContainer" containerID="5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.906929 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f"} err="failed to get container status \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": rpc error: code = NotFound desc = could not find container \"5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f\": container with ID starting with 5e37ff07861aefb3a1c6cb4a6295d9b8a46a52ad13cab68a1c44464640cb919f not found: ID does not exist" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.947701 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.947761 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-log-httpd\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.947813 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-scripts\") pod 
\"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.947844 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6b9j\" (UniqueName: \"kubernetes.io/projected/b97dd219-b277-41ec-9e5a-b7e94da12dbe-kube-api-access-q6b9j\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.947912 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-run-httpd\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.947928 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-config-data\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:58 crc kubenswrapper[4799]: I0319 20:24:58.948042 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.049326 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc 
kubenswrapper[4799]: I0319 20:24:59.049483 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-log-httpd\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.049525 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-scripts\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.049554 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6b9j\" (UniqueName: \"kubernetes.io/projected/b97dd219-b277-41ec-9e5a-b7e94da12dbe-kube-api-access-q6b9j\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.049624 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-run-httpd\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.049643 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-config-data\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.049672 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.050196 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-run-httpd\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.050346 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-log-httpd\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.054437 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-config-data\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.054696 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.066779 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-scripts\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.072526 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.077158 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6b9j\" (UniqueName: \"kubernetes.io/projected/b97dd219-b277-41ec-9e5a-b7e94da12dbe-kube-api-access-q6b9j\") pod \"ceilometer-0\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.140097 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77915c6c-9c63-4784-a290-7c25b984d08b" path="/var/lib/kubelet/pods/77915c6c-9c63-4784-a290-7c25b984d08b/volumes" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.169299 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.645105 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:24:59 crc kubenswrapper[4799]: I0319 20:24:59.790043 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerStarted","Data":"e955288544c24f8d02daab6657ffb611fbd1d442a31ef96847edee8f5ec2468f"} Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.387695 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484456 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-combined-ca-bundle\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484595 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-scripts\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484623 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf2a634-2499-4ea6-853f-9c8852d65e01-logs\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484646 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-config-data\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484745 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmpjj\" (UniqueName: \"kubernetes.io/projected/ecf2a634-2499-4ea6-853f-9c8852d65e01-kube-api-access-zmpjj\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484785 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-tls-certs\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.484827 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-secret-key\") pod \"ecf2a634-2499-4ea6-853f-9c8852d65e01\" (UID: \"ecf2a634-2499-4ea6-853f-9c8852d65e01\") " Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.485277 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecf2a634-2499-4ea6-853f-9c8852d65e01-logs" (OuterVolumeSpecName: "logs") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.489927 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.490268 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf2a634-2499-4ea6-853f-9c8852d65e01-kube-api-access-zmpjj" (OuterVolumeSpecName: "kube-api-access-zmpjj") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "kube-api-access-zmpjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.507825 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-scripts" (OuterVolumeSpecName: "scripts") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.516002 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.519939 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-config-data" (OuterVolumeSpecName: "config-data") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.536365 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ecf2a634-2499-4ea6-853f-9c8852d65e01" (UID: "ecf2a634-2499-4ea6-853f-9c8852d65e01"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587278 4799 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587317 4799 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587331 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf2a634-2499-4ea6-853f-9c8852d65e01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587342 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587353 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf2a634-2499-4ea6-853f-9c8852d65e01-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587390 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecf2a634-2499-4ea6-853f-9c8852d65e01-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.587403 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmpjj\" (UniqueName: \"kubernetes.io/projected/ecf2a634-2499-4ea6-853f-9c8852d65e01-kube-api-access-zmpjj\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.802165 4799 generic.go:334] 
"Generic (PLEG): container finished" podID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerID="75acd7be3a759b0800e888157f520500d7460087d231ed76271305b389800abf" exitCode=0 Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.802359 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46527d99-7ba4-4e4e-baf3-77be33ab0460","Type":"ContainerDied","Data":"75acd7be3a759b0800e888157f520500d7460087d231ed76271305b389800abf"} Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.807740 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerStarted","Data":"f1545d30f44aafc0191b95b3648cb6fdab16034324ed8f82ce81fb39b99e1352"} Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.809078 4799 generic.go:334] "Generic (PLEG): container finished" podID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerID="ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4" exitCode=137 Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.809107 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8869c89f8-jvpbt" event={"ID":"ecf2a634-2499-4ea6-853f-9c8852d65e01","Type":"ContainerDied","Data":"ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4"} Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.809123 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8869c89f8-jvpbt" event={"ID":"ecf2a634-2499-4ea6-853f-9c8852d65e01","Type":"ContainerDied","Data":"0d5befac045ea7bc35cc692c6786017987a9ea930b1eb74042dccde4dee53355"} Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.809139 4799 scope.go:117] "RemoveContainer" containerID="132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.809136 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8869c89f8-jvpbt" Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.866070 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8869c89f8-jvpbt"] Mar 19 20:25:00 crc kubenswrapper[4799]: I0319 20:25:00.882553 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8869c89f8-jvpbt"] Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.014377 4799 scope.go:117] "RemoveContainer" containerID="ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.063103 4799 scope.go:117] "RemoveContainer" containerID="132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612" Mar 19 20:25:01 crc kubenswrapper[4799]: E0319 20:25:01.069850 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612\": container with ID starting with 132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612 not found: ID does not exist" containerID="132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.069890 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612"} err="failed to get container status \"132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612\": rpc error: code = NotFound desc = could not find container \"132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612\": container with ID starting with 132864fdc2dfb6e2e83b9caea2974e5f0ef9f71b31a237763985f954ef282612 not found: ID does not exist" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.069914 4799 scope.go:117] "RemoveContainer" containerID="ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4" Mar 19 20:25:01 crc 
kubenswrapper[4799]: E0319 20:25:01.070258 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4\": container with ID starting with ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4 not found: ID does not exist" containerID="ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.070290 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4"} err="failed to get container status \"ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4\": rpc error: code = NotFound desc = could not find container \"ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4\": container with ID starting with ebf286b98ddcc2fa1e774b06efa9233d10285a353c39de2ec40f02154d33b1e4 not found: ID does not exist" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.156450 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" path="/var/lib/kubelet/pods/ecf2a634-2499-4ea6-853f-9c8852d65e01/volumes" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.236170 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399191 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-scripts\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399255 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399350 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-config-data\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399435 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-logs\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399508 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-combined-ca-bundle\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399543 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-public-tls-certs\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399615 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/46527d99-7ba4-4e4e-baf3-77be33ab0460-kube-api-access-kqnlf\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.399656 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-httpd-run\") pod \"46527d99-7ba4-4e4e-baf3-77be33ab0460\" (UID: \"46527d99-7ba4-4e4e-baf3-77be33ab0460\") " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.401507 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-logs" (OuterVolumeSpecName: "logs") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.401802 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.406343 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.409977 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46527d99-7ba4-4e4e-baf3-77be33ab0460-kube-api-access-kqnlf" (OuterVolumeSpecName: "kube-api-access-kqnlf") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "kube-api-access-kqnlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.412375 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-scripts" (OuterVolumeSpecName: "scripts") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.452202 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.456959 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bkvvs"] Mar 19 20:25:01 crc kubenswrapper[4799]: E0319 20:25:01.457368 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.457401 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" Mar 19 20:25:01 crc kubenswrapper[4799]: E0319 20:25:01.457414 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon-log" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.457421 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon-log" Mar 19 20:25:01 crc kubenswrapper[4799]: E0319 20:25:01.457441 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-log" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.457449 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-log" Mar 19 20:25:01 crc kubenswrapper[4799]: E0319 20:25:01.457458 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-httpd" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.457464 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-httpd" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.457644 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-log" Mar 19 20:25:01 crc 
kubenswrapper[4799]: I0319 20:25:01.457657 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" containerName="glance-httpd" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.457664 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.458446 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf2a634-2499-4ea6-853f-9c8852d65e01" containerName="horizon-log" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.459002 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.470840 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.471017 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.471127 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9pjf8" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.478930 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bkvvs"] Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.503559 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-config-data" (OuterVolumeSpecName: "config-data") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505738 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505765 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505775 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505784 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/46527d99-7ba4-4e4e-baf3-77be33ab0460-kube-api-access-kqnlf\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505792 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46527d99-7ba4-4e4e-baf3-77be33ab0460-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505800 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.505831 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.509499 4799 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "46527d99-7ba4-4e4e-baf3-77be33ab0460" (UID: "46527d99-7ba4-4e4e-baf3-77be33ab0460"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.552001 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.607316 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-scripts\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.607523 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.607592 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-config-data\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.607796 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnwck\" (UniqueName: 
\"kubernetes.io/projected/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-kube-api-access-nnwck\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.607899 4799 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46527d99-7ba4-4e4e-baf3-77be33ab0460-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.607921 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.709657 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-scripts\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.710043 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.710075 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-config-data\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.710127 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnwck\" (UniqueName: \"kubernetes.io/projected/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-kube-api-access-nnwck\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.713545 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-scripts\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.723405 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-config-data\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.724089 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.731809 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnwck\" (UniqueName: \"kubernetes.io/projected/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-kube-api-access-nnwck\") pod \"nova-cell0-conductor-db-sync-bkvvs\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.791138 4799 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.830889 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"46527d99-7ba4-4e4e-baf3-77be33ab0460","Type":"ContainerDied","Data":"908ce2130866427936e02644c2930d27d61dfc7add84d359df6d62f650a394e6"} Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.830947 4799 scope.go:117] "RemoveContainer" containerID="75acd7be3a759b0800e888157f520500d7460087d231ed76271305b389800abf" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.831068 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.838735 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerStarted","Data":"187e77111eafcfca55036cfff881b5ea30ca8585c9e34698bbd41f624c7d305c"} Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.924510 4799 scope.go:117] "RemoveContainer" containerID="71c0fca97b439f69dcf9e6636cbdf117b255ba609995991f274e83aeb2c785ef" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.929483 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.947291 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.967699 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.973073 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.977576 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 20:25:01 crc kubenswrapper[4799]: I0319 20:25:01.977760 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.006425 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119125 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119235 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec6464f3-4c36-4387-8127-56a300a1d79c-logs\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119303 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119364 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ec6464f3-4c36-4387-8127-56a300a1d79c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119401 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119492 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmps2\" (UniqueName: \"kubernetes.io/projected/ec6464f3-4c36-4387-8127-56a300a1d79c-kube-api-access-kmps2\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119526 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.119658 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.220921 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/ec6464f3-4c36-4387-8127-56a300a1d79c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.220966 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.221028 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmps2\" (UniqueName: \"kubernetes.io/projected/ec6464f3-4c36-4387-8127-56a300a1d79c-kube-api-access-kmps2\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.221198 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.221889 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.221946 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.221996 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec6464f3-4c36-4387-8127-56a300a1d79c-logs\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.221528 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec6464f3-4c36-4387-8127-56a300a1d79c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.222097 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.222439 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.222488 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec6464f3-4c36-4387-8127-56a300a1d79c-logs\") pod \"glance-default-external-api-0\" 
(UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.225634 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.225672 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.228613 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.232033 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6464f3-4c36-4387-8127-56a300a1d79c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.236876 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmps2\" (UniqueName: \"kubernetes.io/projected/ec6464f3-4c36-4387-8127-56a300a1d79c-kube-api-access-kmps2\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " 
pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.253908 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"ec6464f3-4c36-4387-8127-56a300a1d79c\") " pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.298608 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.335044 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bkvvs"] Mar 19 20:25:02 crc kubenswrapper[4799]: W0319 20:25:02.350999 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f9a9bf0_25f2_4716_bb99_8374f07ec1ff.slice/crio-a269f8ab252be6b52a304b020556b5d1d16e801b16cbd136f8f1945a7a0f286b WatchSource:0}: Error finding container a269f8ab252be6b52a304b020556b5d1d16e801b16cbd136f8f1945a7a0f286b: Status 404 returned error can't find the container with id a269f8ab252be6b52a304b020556b5d1d16e801b16cbd136f8f1945a7a0f286b Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.820789 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.848688 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec6464f3-4c36-4387-8127-56a300a1d79c","Type":"ContainerStarted","Data":"0d625be96dbe1059824bb2b1bcd939ce78ff2bf8e13f7191c9434db102327724"} Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.851246 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerStarted","Data":"377840c5e622fed6d9d0a6bdc5283f9b6d450c6e07bd1621e87241c2c8b4c69b"} Mar 19 20:25:02 crc kubenswrapper[4799]: I0319 20:25:02.852321 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" event={"ID":"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff","Type":"ContainerStarted","Data":"a269f8ab252be6b52a304b020556b5d1d16e801b16cbd136f8f1945a7a0f286b"} Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.137051 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46527d99-7ba4-4e4e-baf3-77be33ab0460" path="/var/lib/kubelet/pods/46527d99-7ba4-4e4e-baf3-77be33ab0460/volumes" Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.599208 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.599528 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.636953 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.664993 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.921491 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec6464f3-4c36-4387-8127-56a300a1d79c","Type":"ContainerStarted","Data":"bcbeb7f8ea65fd54f051d7632ca2a2d93001e189e6da0779d0c6cc57e537fc10"} Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 20:25:03.921719 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:03 crc kubenswrapper[4799]: I0319 
20:25:03.921730 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:04 crc kubenswrapper[4799]: I0319 20:25:04.903074 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-686489978d-5lwnf" Mar 19 20:25:04 crc kubenswrapper[4799]: I0319 20:25:04.941443 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec6464f3-4c36-4387-8127-56a300a1d79c","Type":"ContainerStarted","Data":"cf107c121f32049ba273300d8e2d2f4b0c3283ff87dcc8fdbbbb4e93141a7d26"} Mar 19 20:25:04 crc kubenswrapper[4799]: I0319 20:25:04.964656 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.964635736 podStartE2EDuration="3.964635736s" podCreationTimestamp="2026-03-19 20:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:04.961608473 +0000 UTC m=+1182.567561545" watchObservedRunningTime="2026-03-19 20:25:04.964635736 +0000 UTC m=+1182.570588808" Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.386869 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-686489978d-5lwnf" Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.456207 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-787b8d7874-lck4d"] Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.456501 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-787b8d7874-lck4d" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-log" containerID="cri-o://acfebf18501c3bc5158d9fb064ae6dd5c280143954fe019afa99be84553b7cbd" gracePeriod=30 Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.456926 4799 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/placement-787b8d7874-lck4d" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-api" containerID="cri-o://fe50d77a9790ca16fa168bd2cd4d26f4bc0a1815e39c3aeb520bac03f38be31e" gracePeriod=30 Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.952297 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerStarted","Data":"4352d10d45f210e71e2c1703b6497570d1a8e81a60648ce275f6e57aff14073c"} Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.952730 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.956154 4799 generic.go:334] "Generic (PLEG): container finished" podID="810c2730-0702-4ff9-b62e-74c6bc564149" containerID="acfebf18501c3bc5158d9fb064ae6dd5c280143954fe019afa99be84553b7cbd" exitCode=143 Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.956210 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-787b8d7874-lck4d" event={"ID":"810c2730-0702-4ff9-b62e-74c6bc564149","Type":"ContainerDied","Data":"acfebf18501c3bc5158d9fb064ae6dd5c280143954fe019afa99be84553b7cbd"} Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.956241 4799 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.956250 4799 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 20:25:05 crc kubenswrapper[4799]: I0319 20:25:05.974479 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.768811919 podStartE2EDuration="7.974462632s" podCreationTimestamp="2026-03-19 20:24:58 +0000 UTC" firstStartedPulling="2026-03-19 20:24:59.649562369 +0000 UTC m=+1177.255515441" lastFinishedPulling="2026-03-19 20:25:04.855213082 +0000 UTC 
m=+1182.461166154" observedRunningTime="2026-03-19 20:25:05.96780513 +0000 UTC m=+1183.573758202" watchObservedRunningTime="2026-03-19 20:25:05.974462632 +0000 UTC m=+1183.580415704" Mar 19 20:25:06 crc kubenswrapper[4799]: I0319 20:25:06.472429 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:06 crc kubenswrapper[4799]: I0319 20:25:06.577965 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 20:25:06 crc kubenswrapper[4799]: I0319 20:25:06.847988 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:07 crc kubenswrapper[4799]: I0319 20:25:07.980619 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-central-agent" containerID="cri-o://f1545d30f44aafc0191b95b3648cb6fdab16034324ed8f82ce81fb39b99e1352" gracePeriod=30 Mar 19 20:25:07 crc kubenswrapper[4799]: I0319 20:25:07.980678 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="proxy-httpd" containerID="cri-o://4352d10d45f210e71e2c1703b6497570d1a8e81a60648ce275f6e57aff14073c" gracePeriod=30 Mar 19 20:25:07 crc kubenswrapper[4799]: I0319 20:25:07.980729 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-notification-agent" containerID="cri-o://187e77111eafcfca55036cfff881b5ea30ca8585c9e34698bbd41f624c7d305c" gracePeriod=30 Mar 19 20:25:07 crc kubenswrapper[4799]: I0319 20:25:07.980692 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="sg-core" 
containerID="cri-o://377840c5e622fed6d9d0a6bdc5283f9b6d450c6e07bd1621e87241c2c8b4c69b" gracePeriod=30 Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.990216 4799 generic.go:334] "Generic (PLEG): container finished" podID="810c2730-0702-4ff9-b62e-74c6bc564149" containerID="fe50d77a9790ca16fa168bd2cd4d26f4bc0a1815e39c3aeb520bac03f38be31e" exitCode=0 Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.990305 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-787b8d7874-lck4d" event={"ID":"810c2730-0702-4ff9-b62e-74c6bc564149","Type":"ContainerDied","Data":"fe50d77a9790ca16fa168bd2cd4d26f4bc0a1815e39c3aeb520bac03f38be31e"} Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.992916 4799 generic.go:334] "Generic (PLEG): container finished" podID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerID="9a11f9f5a9fa98125b8a45799e58a65178cf6d659d9caac7412dd20c13cc0a97" exitCode=0 Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.992984 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699d85bbdd-vfn2t" event={"ID":"794e2fbb-5a64-4bc9-b25a-9041256b23ea","Type":"ContainerDied","Data":"9a11f9f5a9fa98125b8a45799e58a65178cf6d659d9caac7412dd20c13cc0a97"} Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995674 4799 generic.go:334] "Generic (PLEG): container finished" podID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerID="4352d10d45f210e71e2c1703b6497570d1a8e81a60648ce275f6e57aff14073c" exitCode=0 Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995716 4799 generic.go:334] "Generic (PLEG): container finished" podID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerID="377840c5e622fed6d9d0a6bdc5283f9b6d450c6e07bd1621e87241c2c8b4c69b" exitCode=2 Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995724 4799 generic.go:334] "Generic (PLEG): container finished" podID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerID="187e77111eafcfca55036cfff881b5ea30ca8585c9e34698bbd41f624c7d305c" exitCode=0 
Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995732 4799 generic.go:334] "Generic (PLEG): container finished" podID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerID="f1545d30f44aafc0191b95b3648cb6fdab16034324ed8f82ce81fb39b99e1352" exitCode=0 Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995761 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerDied","Data":"4352d10d45f210e71e2c1703b6497570d1a8e81a60648ce275f6e57aff14073c"} Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995797 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerDied","Data":"377840c5e622fed6d9d0a6bdc5283f9b6d450c6e07bd1621e87241c2c8b4c69b"} Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995808 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerDied","Data":"187e77111eafcfca55036cfff881b5ea30ca8585c9e34698bbd41f624c7d305c"} Mar 19 20:25:08 crc kubenswrapper[4799]: I0319 20:25:08.995819 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerDied","Data":"f1545d30f44aafc0191b95b3648cb6fdab16034324ed8f82ce81fb39b99e1352"} Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.842910 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.887863 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926007 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-scripts\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926086 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-run-httpd\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926110 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-log-httpd\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926131 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-public-tls-certs\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926178 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-sg-core-conf-yaml\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926219 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-internal-tls-certs\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926236 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6b9j\" (UniqueName: \"kubernetes.io/projected/b97dd219-b277-41ec-9e5a-b7e94da12dbe-kube-api-access-q6b9j\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926297 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-scripts\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926334 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-combined-ca-bundle\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926359 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810c2730-0702-4ff9-b62e-74c6bc564149-logs\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926376 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-config-data\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926486 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-config-data\") pod \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\" (UID: \"b97dd219-b277-41ec-9e5a-b7e94da12dbe\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926504 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrjrv\" (UniqueName: \"kubernetes.io/projected/810c2730-0702-4ff9-b62e-74c6bc564149-kube-api-access-xrjrv\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.926551 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-combined-ca-bundle\") pod \"810c2730-0702-4ff9-b62e-74c6bc564149\" (UID: \"810c2730-0702-4ff9-b62e-74c6bc564149\") " Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.927193 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/810c2730-0702-4ff9-b62e-74c6bc564149-logs" (OuterVolumeSpecName: "logs") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.929868 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.934219 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97dd219-b277-41ec-9e5a-b7e94da12dbe-kube-api-access-q6b9j" (OuterVolumeSpecName: "kube-api-access-q6b9j") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "kube-api-access-q6b9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.937438 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.949771 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810c2730-0702-4ff9-b62e-74c6bc564149-kube-api-access-xrjrv" (OuterVolumeSpecName: "kube-api-access-xrjrv") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "kube-api-access-xrjrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.949873 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-scripts" (OuterVolumeSpecName: "scripts") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.969971 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-scripts" (OuterVolumeSpecName: "scripts") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.989729 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:11 crc kubenswrapper[4799]: I0319 20:25:11.990932 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.019244 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-config-data" (OuterVolumeSpecName: "config-data") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028227 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028252 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b97dd219-b277-41ec-9e5a-b7e94da12dbe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028261 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028272 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6b9j\" (UniqueName: \"kubernetes.io/projected/b97dd219-b277-41ec-9e5a-b7e94da12dbe-kube-api-access-q6b9j\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028280 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028289 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/810c2730-0702-4ff9-b62e-74c6bc564149-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028298 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028306 4799 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-xrjrv\" (UniqueName: \"kubernetes.io/projected/810c2730-0702-4ff9-b62e-74c6bc564149-kube-api-access-xrjrv\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028314 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.028321 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.031105 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b97dd219-b277-41ec-9e5a-b7e94da12dbe","Type":"ContainerDied","Data":"e955288544c24f8d02daab6657ffb611fbd1d442a31ef96847edee8f5ec2468f"} Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.031162 4799 scope.go:117] "RemoveContainer" containerID="4352d10d45f210e71e2c1703b6497570d1a8e81a60648ce275f6e57aff14073c" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.031294 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.040469 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" event={"ID":"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff","Type":"ContainerStarted","Data":"a893e648474e84340cee07704a112602321168fea4d163230837a5e2da3bf0ef"} Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.042848 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-787b8d7874-lck4d" event={"ID":"810c2730-0702-4ff9-b62e-74c6bc564149","Type":"ContainerDied","Data":"e1e19e8bc5fd6abf2f10ba72282b61e1fb452954fc8c1cf6a47b43995ce952ca"} Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.042916 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-787b8d7874-lck4d" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.045152 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.056859 4799 scope.go:117] "RemoveContainer" containerID="377840c5e622fed6d9d0a6bdc5283f9b6d450c6e07bd1621e87241c2c8b4c69b" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.060233 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.063071 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" podStartSLOduration=1.8522914369999999 podStartE2EDuration="11.06304381s" podCreationTimestamp="2026-03-19 20:25:01 +0000 UTC" firstStartedPulling="2026-03-19 20:25:02.355170287 +0000 UTC m=+1179.961123349" lastFinishedPulling="2026-03-19 20:25:11.56592265 +0000 UTC m=+1189.171875722" observedRunningTime="2026-03-19 20:25:12.052518232 +0000 UTC m=+1189.658471304" watchObservedRunningTime="2026-03-19 20:25:12.06304381 +0000 UTC m=+1189.668996882" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.073109 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "810c2730-0702-4ff9-b62e-74c6bc564149" (UID: "810c2730-0702-4ff9-b62e-74c6bc564149"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.075201 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.076952 4799 scope.go:117] "RemoveContainer" containerID="187e77111eafcfca55036cfff881b5ea30ca8585c9e34698bbd41f624c7d305c" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.100046 4799 scope.go:117] "RemoveContainer" containerID="f1545d30f44aafc0191b95b3648cb6fdab16034324ed8f82ce81fb39b99e1352" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.101682 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-config-data" (OuterVolumeSpecName: "config-data") pod "b97dd219-b277-41ec-9e5a-b7e94da12dbe" (UID: "b97dd219-b277-41ec-9e5a-b7e94da12dbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.120602 4799 scope.go:117] "RemoveContainer" containerID="fe50d77a9790ca16fa168bd2cd4d26f4bc0a1815e39c3aeb520bac03f38be31e" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.128965 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-ovndb-tls-certs\") pod \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129024 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-config\") pod \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129052 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-httpd-config\") pod 
\"794e2fbb-5a64-4bc9-b25a-9041256b23ea\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129096 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kffkm\" (UniqueName: \"kubernetes.io/projected/794e2fbb-5a64-4bc9-b25a-9041256b23ea-kube-api-access-kffkm\") pod \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129124 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-combined-ca-bundle\") pod \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\" (UID: \"794e2fbb-5a64-4bc9-b25a-9041256b23ea\") " Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129579 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129593 4799 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129603 4799 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/810c2730-0702-4ff9-b62e-74c6bc564149-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.129611 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97dd219-b277-41ec-9e5a-b7e94da12dbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.135028 4799 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794e2fbb-5a64-4bc9-b25a-9041256b23ea-kube-api-access-kffkm" (OuterVolumeSpecName: "kube-api-access-kffkm") pod "794e2fbb-5a64-4bc9-b25a-9041256b23ea" (UID: "794e2fbb-5a64-4bc9-b25a-9041256b23ea"). InnerVolumeSpecName "kube-api-access-kffkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.144898 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "794e2fbb-5a64-4bc9-b25a-9041256b23ea" (UID: "794e2fbb-5a64-4bc9-b25a-9041256b23ea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.147308 4799 scope.go:117] "RemoveContainer" containerID="acfebf18501c3bc5158d9fb064ae6dd5c280143954fe019afa99be84553b7cbd" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.192097 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-config" (OuterVolumeSpecName: "config") pod "794e2fbb-5a64-4bc9-b25a-9041256b23ea" (UID: "794e2fbb-5a64-4bc9-b25a-9041256b23ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.203778 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794e2fbb-5a64-4bc9-b25a-9041256b23ea" (UID: "794e2fbb-5a64-4bc9-b25a-9041256b23ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.214573 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "794e2fbb-5a64-4bc9-b25a-9041256b23ea" (UID: "794e2fbb-5a64-4bc9-b25a-9041256b23ea"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.231949 4799 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.231990 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.232014 4799 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.232031 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kffkm\" (UniqueName: \"kubernetes.io/projected/794e2fbb-5a64-4bc9-b25a-9041256b23ea-kube-api-access-kffkm\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.232051 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794e2fbb-5a64-4bc9-b25a-9041256b23ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.299679 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.300934 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.349233 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.355417 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.400183 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.412256 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429078 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429525 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-notification-agent" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429539 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-notification-agent" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429552 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-httpd" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429558 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-httpd" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429567 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="sg-core" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429573 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="sg-core" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429598 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-api" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429604 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-api" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429620 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="proxy-httpd" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429626 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="proxy-httpd" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429637 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-api" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429643 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-api" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429657 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-log" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429663 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-log" Mar 19 20:25:12 crc kubenswrapper[4799]: E0319 20:25:12.429674 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" 
containerName="ceilometer-central-agent" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429680 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-central-agent" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429869 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="sg-core" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429885 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-central-agent" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429893 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-httpd" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429907 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-api" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429920 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="proxy-httpd" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429927 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" containerName="ceilometer-notification-agent" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429933 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" containerName="neutron-api" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.429944 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" containerName="placement-log" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.431574 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.433507 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.434108 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.464163 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-787b8d7874-lck4d"] Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.473597 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-787b8d7874-lck4d"] Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.519459 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.537809 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.537901 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-scripts\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.538530 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-log-httpd\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc 
kubenswrapper[4799]: I0319 20:25:12.538575 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8944\" (UniqueName: \"kubernetes.io/projected/ecfc744b-9bf8-443a-8078-83505c831553-kube-api-access-l8944\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.538648 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.538671 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-config-data\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.538859 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-run-httpd\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.640755 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.640822 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-scripts\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.640897 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-log-httpd\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.640935 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8944\" (UniqueName: \"kubernetes.io/projected/ecfc744b-9bf8-443a-8078-83505c831553-kube-api-access-l8944\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.640979 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.640999 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-config-data\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.641032 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-run-httpd\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: 
I0319 20:25:12.641609 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-log-httpd\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.641633 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-run-httpd\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.645102 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-scripts\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.645944 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.646050 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.654489 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-config-data\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " 
pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.659325 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8944\" (UniqueName: \"kubernetes.io/projected/ecfc744b-9bf8-443a-8078-83505c831553-kube-api-access-l8944\") pod \"ceilometer-0\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " pod="openstack/ceilometer-0" Mar 19 20:25:12 crc kubenswrapper[4799]: I0319 20:25:12.863258 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.055320 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699d85bbdd-vfn2t" event={"ID":"794e2fbb-5a64-4bc9-b25a-9041256b23ea","Type":"ContainerDied","Data":"4e563b25a31661852ef45e42c086ca84e78ae10096ac6ffce5a3972184de05b6"} Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.056036 4799 scope.go:117] "RemoveContainer" containerID="20c0185206eb3196343f35f5c5920f7734a7afa5c139e36eb4292590b3552508" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.055576 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-699d85bbdd-vfn2t" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.072671 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.072720 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.099201 4799 scope.go:117] "RemoveContainer" containerID="9a11f9f5a9fa98125b8a45799e58a65178cf6d659d9caac7412dd20c13cc0a97" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.102713 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-699d85bbdd-vfn2t"] Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.184403 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810c2730-0702-4ff9-b62e-74c6bc564149" path="/var/lib/kubelet/pods/810c2730-0702-4ff9-b62e-74c6bc564149/volumes" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.186988 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97dd219-b277-41ec-9e5a-b7e94da12dbe" path="/var/lib/kubelet/pods/b97dd219-b277-41ec-9e5a-b7e94da12dbe/volumes" Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.187890 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-699d85bbdd-vfn2t"] Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.324370 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:13 crc kubenswrapper[4799]: I0319 20:25:13.351692 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:14 crc kubenswrapper[4799]: I0319 20:25:14.080875 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerStarted","Data":"ffe9bab1b12fe4f95b8bb43b8fabbe1d0a41df0bea2a781b580c0d8b8f7352d6"} Mar 19 20:25:14 crc kubenswrapper[4799]: I0319 20:25:14.875723 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 20:25:14 crc kubenswrapper[4799]: I0319 20:25:14.878199 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 20:25:15 crc kubenswrapper[4799]: I0319 20:25:15.101906 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerStarted","Data":"9637eb8537ba50ef08bc6a9a3228bafc35a7ca943ab44d52d1ecc90b577e74f6"} Mar 19 20:25:15 crc kubenswrapper[4799]: I0319 20:25:15.101963 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerStarted","Data":"3d4950235c9665f4d26dd0b072682c57b5cf382a2631a8697ae602a7c27d7294"} Mar 19 20:25:15 crc kubenswrapper[4799]: I0319 20:25:15.127452 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794e2fbb-5a64-4bc9-b25a-9041256b23ea" path="/var/lib/kubelet/pods/794e2fbb-5a64-4bc9-b25a-9041256b23ea/volumes" Mar 19 20:25:16 crc kubenswrapper[4799]: I0319 20:25:16.118554 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerStarted","Data":"dbd3cdad3743be91ff25ac071ce35fa4cee772492e8033897037f57b1cf99a43"} Mar 19 20:25:18 crc kubenswrapper[4799]: I0319 20:25:18.135235 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerStarted","Data":"1c3b4e3e099e8ff5d2b834d14ace9d35837eea8571583d3e471fa1213b7ea6b1"} Mar 19 20:25:18 crc 
kubenswrapper[4799]: I0319 20:25:18.135851 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:25:18 crc kubenswrapper[4799]: I0319 20:25:18.135451 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="proxy-httpd" containerID="cri-o://1c3b4e3e099e8ff5d2b834d14ace9d35837eea8571583d3e471fa1213b7ea6b1" gracePeriod=30 Mar 19 20:25:18 crc kubenswrapper[4799]: I0319 20:25:18.135416 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-central-agent" containerID="cri-o://3d4950235c9665f4d26dd0b072682c57b5cf382a2631a8697ae602a7c27d7294" gracePeriod=30 Mar 19 20:25:18 crc kubenswrapper[4799]: I0319 20:25:18.135475 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="sg-core" containerID="cri-o://dbd3cdad3743be91ff25ac071ce35fa4cee772492e8033897037f57b1cf99a43" gracePeriod=30 Mar 19 20:25:18 crc kubenswrapper[4799]: I0319 20:25:18.135486 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-notification-agent" containerID="cri-o://9637eb8537ba50ef08bc6a9a3228bafc35a7ca943ab44d52d1ecc90b577e74f6" gracePeriod=30 Mar 19 20:25:18 crc kubenswrapper[4799]: I0319 20:25:18.175342 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.363906776 podStartE2EDuration="6.175324417s" podCreationTimestamp="2026-03-19 20:25:12 +0000 UTC" firstStartedPulling="2026-03-19 20:25:13.358581763 +0000 UTC m=+1190.964534835" lastFinishedPulling="2026-03-19 20:25:17.169999404 +0000 UTC m=+1194.775952476" 
observedRunningTime="2026-03-19 20:25:18.170107964 +0000 UTC m=+1195.776061036" watchObservedRunningTime="2026-03-19 20:25:18.175324417 +0000 UTC m=+1195.781277489" Mar 19 20:25:19 crc kubenswrapper[4799]: I0319 20:25:19.145928 4799 generic.go:334] "Generic (PLEG): container finished" podID="ecfc744b-9bf8-443a-8078-83505c831553" containerID="1c3b4e3e099e8ff5d2b834d14ace9d35837eea8571583d3e471fa1213b7ea6b1" exitCode=0 Mar 19 20:25:19 crc kubenswrapper[4799]: I0319 20:25:19.145966 4799 generic.go:334] "Generic (PLEG): container finished" podID="ecfc744b-9bf8-443a-8078-83505c831553" containerID="dbd3cdad3743be91ff25ac071ce35fa4cee772492e8033897037f57b1cf99a43" exitCode=2 Mar 19 20:25:19 crc kubenswrapper[4799]: I0319 20:25:19.145975 4799 generic.go:334] "Generic (PLEG): container finished" podID="ecfc744b-9bf8-443a-8078-83505c831553" containerID="9637eb8537ba50ef08bc6a9a3228bafc35a7ca943ab44d52d1ecc90b577e74f6" exitCode=0 Mar 19 20:25:19 crc kubenswrapper[4799]: I0319 20:25:19.145996 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerDied","Data":"1c3b4e3e099e8ff5d2b834d14ace9d35837eea8571583d3e471fa1213b7ea6b1"} Mar 19 20:25:19 crc kubenswrapper[4799]: I0319 20:25:19.146032 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerDied","Data":"dbd3cdad3743be91ff25ac071ce35fa4cee772492e8033897037f57b1cf99a43"} Mar 19 20:25:19 crc kubenswrapper[4799]: I0319 20:25:19.146046 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerDied","Data":"9637eb8537ba50ef08bc6a9a3228bafc35a7ca943ab44d52d1ecc90b577e74f6"} Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.178647 4799 generic.go:334] "Generic (PLEG): container finished" podID="1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" 
containerID="a893e648474e84340cee07704a112602321168fea4d163230837a5e2da3bf0ef" exitCode=0 Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.178894 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" event={"ID":"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff","Type":"ContainerDied","Data":"a893e648474e84340cee07704a112602321168fea4d163230837a5e2da3bf0ef"} Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.184094 4799 generic.go:334] "Generic (PLEG): container finished" podID="ecfc744b-9bf8-443a-8078-83505c831553" containerID="3d4950235c9665f4d26dd0b072682c57b5cf382a2631a8697ae602a7c27d7294" exitCode=0 Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.184164 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerDied","Data":"3d4950235c9665f4d26dd0b072682c57b5cf382a2631a8697ae602a7c27d7294"} Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.557748 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.631565 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-run-httpd\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.631953 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.632123 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-log-httpd\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.632569 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-combined-ca-bundle\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.632754 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8944\" (UniqueName: \"kubernetes.io/projected/ecfc744b-9bf8-443a-8078-83505c831553-kube-api-access-l8944\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.632854 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.633061 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-sg-core-conf-yaml\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.633218 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-config-data\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.633481 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-scripts\") pod \"ecfc744b-9bf8-443a-8078-83505c831553\" (UID: \"ecfc744b-9bf8-443a-8078-83505c831553\") " Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.634350 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.634512 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfc744b-9bf8-443a-8078-83505c831553-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.640968 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfc744b-9bf8-443a-8078-83505c831553-kube-api-access-l8944" (OuterVolumeSpecName: "kube-api-access-l8944") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). 
InnerVolumeSpecName "kube-api-access-l8944". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.644930 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-scripts" (OuterVolumeSpecName: "scripts") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.680629 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.736157 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8944\" (UniqueName: \"kubernetes.io/projected/ecfc744b-9bf8-443a-8078-83505c831553-kube-api-access-l8944\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.736377 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.736415 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.737559 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.777626 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-config-data" (OuterVolumeSpecName: "config-data") pod "ecfc744b-9bf8-443a-8078-83505c831553" (UID: "ecfc744b-9bf8-443a-8078-83505c831553"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.838326 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:22 crc kubenswrapper[4799]: I0319 20:25:22.838354 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfc744b-9bf8-443a-8078-83505c831553-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.198864 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.199561 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfc744b-9bf8-443a-8078-83505c831553","Type":"ContainerDied","Data":"ffe9bab1b12fe4f95b8bb43b8fabbe1d0a41df0bea2a781b580c0d8b8f7352d6"} Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.199613 4799 scope.go:117] "RemoveContainer" containerID="1c3b4e3e099e8ff5d2b834d14ace9d35837eea8571583d3e471fa1213b7ea6b1" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.311882 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.326711 4799 scope.go:117] "RemoveContainer" containerID="dbd3cdad3743be91ff25ac071ce35fa4cee772492e8033897037f57b1cf99a43" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.330174 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.345274 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:23 crc kubenswrapper[4799]: E0319 20:25:23.346360 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-central-agent" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.346456 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-central-agent" Mar 19 20:25:23 crc kubenswrapper[4799]: E0319 20:25:23.346499 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="proxy-httpd" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.346505 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="proxy-httpd" Mar 19 20:25:23 crc kubenswrapper[4799]: E0319 
20:25:23.346513 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-notification-agent" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.346519 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-notification-agent" Mar 19 20:25:23 crc kubenswrapper[4799]: E0319 20:25:23.346809 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="sg-core" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.346821 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="sg-core" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.347097 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="proxy-httpd" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.347123 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-notification-agent" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.347131 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="ceilometer-central-agent" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.347140 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfc744b-9bf8-443a-8078-83505c831553" containerName="sg-core" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.348964 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.351730 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.352123 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.355980 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.376647 4799 scope.go:117] "RemoveContainer" containerID="9637eb8537ba50ef08bc6a9a3228bafc35a7ca943ab44d52d1ecc90b577e74f6" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.413219 4799 scope.go:117] "RemoveContainer" containerID="3d4950235c9665f4d26dd0b072682c57b5cf382a2631a8697ae602a7c27d7294" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449168 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449211 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449240 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mthc\" (UniqueName: \"kubernetes.io/projected/577f335a-7717-437f-af13-c8ffa4f8f4ea-kube-api-access-6mthc\") pod \"ceilometer-0\" (UID: 
\"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449315 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-run-httpd\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449335 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-scripts\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449360 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-config-data\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.449410 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-log-httpd\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.550784 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-run-httpd\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.550827 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-scripts\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.550860 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-config-data\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.550904 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-log-httpd\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.550963 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.550981 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.551002 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mthc\" (UniqueName: \"kubernetes.io/projected/577f335a-7717-437f-af13-c8ffa4f8f4ea-kube-api-access-6mthc\") pod \"ceilometer-0\" (UID: 
\"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.551750 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-run-httpd\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.551781 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-log-httpd\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.559356 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.559863 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.560251 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-config-data\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.561226 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.562055 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-scripts\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.567066 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mthc\" (UniqueName: \"kubernetes.io/projected/577f335a-7717-437f-af13-c8ffa4f8f4ea-kube-api-access-6mthc\") pod \"ceilometer-0\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.652930 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnwck\" (UniqueName: \"kubernetes.io/projected/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-kube-api-access-nnwck\") pod \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.653090 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-combined-ca-bundle\") pod \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.653317 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-config-data\") pod \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.653357 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-scripts\") pod \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\" (UID: \"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff\") " Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.660592 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-kube-api-access-nnwck" (OuterVolumeSpecName: "kube-api-access-nnwck") pod "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" (UID: "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff"). InnerVolumeSpecName "kube-api-access-nnwck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.666870 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-scripts" (OuterVolumeSpecName: "scripts") pod "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" (UID: "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.682707 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.687467 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" (UID: "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.703699 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-config-data" (OuterVolumeSpecName: "config-data") pod "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" (UID: "1f9a9bf0-25f2-4716-bb99-8374f07ec1ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.757795 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.757832 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.757846 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnwck\" (UniqueName: \"kubernetes.io/projected/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-kube-api-access-nnwck\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:23 crc kubenswrapper[4799]: I0319 20:25:23.757859 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.156137 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.221981 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerStarted","Data":"6294282a2f4c313564b449931967d6b48502f0c79c191c77026ad811429a1e36"} Mar 19 
20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.226804 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.227681 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bkvvs" event={"ID":"1f9a9bf0-25f2-4716-bb99-8374f07ec1ff","Type":"ContainerDied","Data":"a269f8ab252be6b52a304b020556b5d1d16e801b16cbd136f8f1945a7a0f286b"} Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.227736 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a269f8ab252be6b52a304b020556b5d1d16e801b16cbd136f8f1945a7a0f286b" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.321057 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 20:25:24 crc kubenswrapper[4799]: E0319 20:25:24.321487 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" containerName="nova-cell0-conductor-db-sync" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.321502 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" containerName="nova-cell0-conductor-db-sync" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.321684 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" containerName="nova-cell0-conductor-db-sync" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.322274 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.325251 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9pjf8" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.325265 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.351443 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.368887 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8aa3e34-ca13-45cd-960f-2973a80c80b8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.368993 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgbh\" (UniqueName: \"kubernetes.io/projected/d8aa3e34-ca13-45cd-960f-2973a80c80b8-kube-api-access-vjgbh\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.369057 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8aa3e34-ca13-45cd-960f-2973a80c80b8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.470079 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgbh\" (UniqueName: 
\"kubernetes.io/projected/d8aa3e34-ca13-45cd-960f-2973a80c80b8-kube-api-access-vjgbh\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.470168 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8aa3e34-ca13-45cd-960f-2973a80c80b8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.470220 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8aa3e34-ca13-45cd-960f-2973a80c80b8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.478201 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8aa3e34-ca13-45cd-960f-2973a80c80b8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.479357 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8aa3e34-ca13-45cd-960f-2973a80c80b8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.491940 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgbh\" (UniqueName: \"kubernetes.io/projected/d8aa3e34-ca13-45cd-960f-2973a80c80b8-kube-api-access-vjgbh\") pod \"nova-cell0-conductor-0\" (UID: 
\"d8aa3e34-ca13-45cd-960f-2973a80c80b8\") " pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:24 crc kubenswrapper[4799]: I0319 20:25:24.638980 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:25 crc kubenswrapper[4799]: W0319 20:25:25.136375 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8aa3e34_ca13_45cd_960f_2973a80c80b8.slice/crio-a016c39aa0ad19ed26efcc5b4b2f9c999df102670a8400ce592bbff511f1829f WatchSource:0}: Error finding container a016c39aa0ad19ed26efcc5b4b2f9c999df102670a8400ce592bbff511f1829f: Status 404 returned error can't find the container with id a016c39aa0ad19ed26efcc5b4b2f9c999df102670a8400ce592bbff511f1829f Mar 19 20:25:25 crc kubenswrapper[4799]: I0319 20:25:25.139503 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfc744b-9bf8-443a-8078-83505c831553" path="/var/lib/kubelet/pods/ecfc744b-9bf8-443a-8078-83505c831553/volumes" Mar 19 20:25:25 crc kubenswrapper[4799]: I0319 20:25:25.140408 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 20:25:25 crc kubenswrapper[4799]: I0319 20:25:25.253186 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerStarted","Data":"6f627caa8f5379558b680834e38629d020e5d8166f8afd16ddc1993e618e9d4e"} Mar 19 20:25:25 crc kubenswrapper[4799]: I0319 20:25:25.256638 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d8aa3e34-ca13-45cd-960f-2973a80c80b8","Type":"ContainerStarted","Data":"a016c39aa0ad19ed26efcc5b4b2f9c999df102670a8400ce592bbff511f1829f"} Mar 19 20:25:26 crc kubenswrapper[4799]: I0319 20:25:26.270785 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"d8aa3e34-ca13-45cd-960f-2973a80c80b8","Type":"ContainerStarted","Data":"fb96ddc06a71e43f8190307a72a22380b2e2cb0b3035776239afed6b948286e3"} Mar 19 20:25:26 crc kubenswrapper[4799]: I0319 20:25:26.271190 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:26 crc kubenswrapper[4799]: I0319 20:25:26.276231 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerStarted","Data":"f0144add50ea70b32cd098981d335a8b5249f18fddca693f0a655ea364a2319b"} Mar 19 20:25:26 crc kubenswrapper[4799]: I0319 20:25:26.276269 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerStarted","Data":"25597ba4ff789adabd2dd2fed33956f15575ec41234dea7f8d03cd5cf66ac66d"} Mar 19 20:25:26 crc kubenswrapper[4799]: I0319 20:25:26.292377 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.292357419 podStartE2EDuration="2.292357419s" podCreationTimestamp="2026-03-19 20:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:26.286440287 +0000 UTC m=+1203.892393359" watchObservedRunningTime="2026-03-19 20:25:26.292357419 +0000 UTC m=+1203.898310491" Mar 19 20:25:28 crc kubenswrapper[4799]: I0319 20:25:28.293713 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerStarted","Data":"17405a49c8b2d234457da915034c70d5143d65c907f5f5ea820e5a3c6fa8891e"} Mar 19 20:25:28 crc kubenswrapper[4799]: I0319 20:25:28.294674 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:25:28 crc kubenswrapper[4799]: I0319 
20:25:28.325323 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.680694778 podStartE2EDuration="5.325301965s" podCreationTimestamp="2026-03-19 20:25:23 +0000 UTC" firstStartedPulling="2026-03-19 20:25:24.15951117 +0000 UTC m=+1201.765464272" lastFinishedPulling="2026-03-19 20:25:27.804118367 +0000 UTC m=+1205.410071459" observedRunningTime="2026-03-19 20:25:28.31415527 +0000 UTC m=+1205.920108352" watchObservedRunningTime="2026-03-19 20:25:28.325301965 +0000 UTC m=+1205.931255047" Mar 19 20:25:34 crc kubenswrapper[4799]: I0319 20:25:34.682318 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.192279 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-59gtz"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.193714 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.196165 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.206182 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.219293 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-59gtz"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.283038 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-scripts\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.283096 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-config-data\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.283157 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.283191 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhm6\" (UniqueName: 
\"kubernetes.io/projected/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-kube-api-access-ljhm6\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.385416 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-scripts\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.385878 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-config-data\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.385959 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.385989 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhm6\" (UniqueName: \"kubernetes.io/projected/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-kube-api-access-ljhm6\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.404689 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhm6\" (UniqueName: 
\"kubernetes.io/projected/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-kube-api-access-ljhm6\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.409619 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-config-data\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.409655 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.409936 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-scripts\") pod \"nova-cell0-cell-mapping-59gtz\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.446228 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.447853 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.452373 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.460137 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.487169 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9kts\" (UniqueName: \"kubernetes.io/projected/21c12faf-4246-42fa-aad3-031afd1357f2-kube-api-access-l9kts\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.487275 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.487293 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-config-data\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.487396 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c12faf-4246-42fa-aad3-031afd1357f2-logs\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.527035 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.541156 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.542704 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.547870 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.570202 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591334 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c12faf-4246-42fa-aad3-031afd1357f2-logs\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591470 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9kts\" (UniqueName: \"kubernetes.io/projected/21c12faf-4246-42fa-aad3-031afd1357f2-kube-api-access-l9kts\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591631 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591647 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-config-data\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591669 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-config-data\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591730 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrt7f\" (UniqueName: \"kubernetes.io/projected/74a42150-ef0f-4283-9627-a01724dc2a45-kube-api-access-rrt7f\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591797 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.591875 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a42150-ef0f-4283-9627-a01724dc2a45-logs\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.592505 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c12faf-4246-42fa-aad3-031afd1357f2-logs\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " 
pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.613225 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.623605 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-config-data\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.642130 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9kts\" (UniqueName: \"kubernetes.io/projected/21c12faf-4246-42fa-aad3-031afd1357f2-kube-api-access-l9kts\") pod \"nova-api-0\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.643420 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.644720 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.646605 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.679443 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.697575 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-config-data\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.718201 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrt7f\" (UniqueName: \"kubernetes.io/projected/74a42150-ef0f-4283-9627-a01724dc2a45-kube-api-access-rrt7f\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.718355 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.718571 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a42150-ef0f-4283-9627-a01724dc2a45-logs\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.718751 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.718889 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5qx\" (UniqueName: \"kubernetes.io/projected/fe9fa15c-1183-4a78-9068-64ecc9e4397e-kube-api-access-wr5qx\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.719058 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-config-data\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.703627 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-config-data\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.703675 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-gdxrf"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.723145 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.724263 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a42150-ef0f-4283-9627-a01724dc2a45-logs\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.730023 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.750183 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrt7f\" (UniqueName: \"kubernetes.io/projected/74a42150-ef0f-4283-9627-a01724dc2a45-kube-api-access-rrt7f\") pod \"nova-metadata-0\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.751844 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.759116 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-gdxrf"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.806911 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.808238 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.811539 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.812025 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.827807 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.828449 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-nb\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.828501 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-svc\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.828533 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-config\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.828565 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-swift-storage-0\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.828617 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.829099 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5qx\" (UniqueName: \"kubernetes.io/projected/fe9fa15c-1183-4a78-9068-64ecc9e4397e-kube-api-access-wr5qx\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.829133 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jscqw\" (UniqueName: \"kubernetes.io/projected/eb382d8f-526e-4f8c-bc45-d1afd184fb98-kube-api-access-jscqw\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.829164 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-sb\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.829191 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-config-data\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.833025 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.839132 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-config-data\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.847489 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5qx\" (UniqueName: \"kubernetes.io/projected/fe9fa15c-1183-4a78-9068-64ecc9e4397e-kube-api-access-wr5qx\") pod \"nova-scheduler-0\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931465 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-nb\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931508 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-svc\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " 
pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931540 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-config\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931567 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931595 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-swift-storage-0\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931673 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2r9t\" (UniqueName: \"kubernetes.io/projected/671cfbdc-6f8c-452e-a771-6a762660c355-kube-api-access-v2r9t\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931700 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931732 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jscqw\" (UniqueName: \"kubernetes.io/projected/eb382d8f-526e-4f8c-bc45-d1afd184fb98-kube-api-access-jscqw\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.931758 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-sb\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.933318 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-config\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.933358 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-nb\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.933403 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-svc\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 
20:25:35.933728 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-swift-storage-0\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.934701 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-sb\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:35 crc kubenswrapper[4799]: I0319 20:25:35.950416 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jscqw\" (UniqueName: \"kubernetes.io/projected/eb382d8f-526e-4f8c-bc45-d1afd184fb98-kube-api-access-jscqw\") pod \"dnsmasq-dns-84bb6d55fc-gdxrf\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.033736 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2r9t\" (UniqueName: \"kubernetes.io/projected/671cfbdc-6f8c-452e-a771-6a762660c355-kube-api-access-v2r9t\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.033787 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.033897 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.038715 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.039226 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.052233 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2r9t\" (UniqueName: \"kubernetes.io/projected/671cfbdc-6f8c-452e-a771-6a762660c355-kube-api-access-v2r9t\") pod \"nova-cell1-novncproxy-0\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.066469 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.095673 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.142525 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-59gtz"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.148642 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:36 crc kubenswrapper[4799]: W0319 20:25:36.177969 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c1429cf_e6a9_4c99_af55_4ab7d80cd651.slice/crio-110a7ed3e804a8ea936818525f0a9bc8d0db087f9500ce2ba4168786634a4a43 WatchSource:0}: Error finding container 110a7ed3e804a8ea936818525f0a9bc8d0db087f9500ce2ba4168786634a4a43: Status 404 returned error can't find the container with id 110a7ed3e804a8ea936818525f0a9bc8d0db087f9500ce2ba4168786634a4a43 Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.305665 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.351503 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.384319 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-59gtz" event={"ID":"0c1429cf-e6a9-4c99-af55-4ab7d80cd651","Type":"ContainerStarted","Data":"110a7ed3e804a8ea936818525f0a9bc8d0db087f9500ce2ba4168786634a4a43"} Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.385488 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"21c12faf-4246-42fa-aad3-031afd1357f2","Type":"ContainerStarted","Data":"e66f9d9c03fda8455fb37636960a53f0ae063097ac9dc3dacda812d04c04521a"} Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.386607 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"74a42150-ef0f-4283-9627-a01724dc2a45","Type":"ContainerStarted","Data":"203d258f5341b4048dbe480b4a06b0d6aa553b0fc60faf23f247286c88d1da2f"} Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.490526 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ftd7g"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.492452 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.495168 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.495229 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.517475 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ftd7g"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.551453 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-scripts\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.551731 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-config-data\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.552040 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jn7s\" (UniqueName: \"kubernetes.io/projected/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-kube-api-access-6jn7s\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.552250 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.650594 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.656072 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-config-data\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.656320 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jn7s\" (UniqueName: \"kubernetes.io/projected/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-kube-api-access-6jn7s\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.656472 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.656528 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-scripts\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.661306 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.661975 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-scripts\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.664568 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-config-data\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.676891 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jn7s\" (UniqueName: \"kubernetes.io/projected/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-kube-api-access-6jn7s\") pod \"nova-cell1-conductor-db-sync-ftd7g\" (UID: 
\"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.773858 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.782282 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-gdxrf"] Mar 19 20:25:36 crc kubenswrapper[4799]: I0319 20:25:36.828160 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.316430 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ftd7g"] Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.400565 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-59gtz" event={"ID":"0c1429cf-e6a9-4c99-af55-4ab7d80cd651","Type":"ContainerStarted","Data":"6a4e2db20e948f9df36747bb9ee78d065c8a7e573258544955ca98c780dbd65a"} Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.403552 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe9fa15c-1183-4a78-9068-64ecc9e4397e","Type":"ContainerStarted","Data":"bd44b6bf36364d0ebccfba0f8a37187419da6c8b64c44b3a2a1d5722ac068f08"} Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.405415 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"671cfbdc-6f8c-452e-a771-6a762660c355","Type":"ContainerStarted","Data":"97a1aae30921f5166f53b4c519d6b29b9890ff75fee3365577c30ed80b5910c6"} Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.406660 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" 
event={"ID":"7edd0f4d-c473-43d0-acad-40ba65e1ae5b","Type":"ContainerStarted","Data":"0941baa2abde9a18cee4917a870e95dd1c4ffb0c31bece27a0f1a2b94602d489"} Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.408890 4799 generic.go:334] "Generic (PLEG): container finished" podID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerID="bf91bb519310d3b782d3965b79aec26003904606f8d83c4cf8761dccff940634" exitCode=0 Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.408928 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" event={"ID":"eb382d8f-526e-4f8c-bc45-d1afd184fb98","Type":"ContainerDied","Data":"bf91bb519310d3b782d3965b79aec26003904606f8d83c4cf8761dccff940634"} Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.408970 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" event={"ID":"eb382d8f-526e-4f8c-bc45-d1afd184fb98","Type":"ContainerStarted","Data":"037b3d4e08c200dbba159fe6ec4f24cc013bef0085454f900abd417565e01274"} Mar 19 20:25:37 crc kubenswrapper[4799]: I0319 20:25:37.432840 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-59gtz" podStartSLOduration=2.432819224 podStartE2EDuration="2.432819224s" podCreationTimestamp="2026-03-19 20:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:37.418298997 +0000 UTC m=+1215.024252079" watchObservedRunningTime="2026-03-19 20:25:37.432819224 +0000 UTC m=+1215.038772296" Mar 19 20:25:38 crc kubenswrapper[4799]: I0319 20:25:38.432346 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" event={"ID":"7edd0f4d-c473-43d0-acad-40ba65e1ae5b","Type":"ContainerStarted","Data":"f651de3286bdd3b79e81fe20e4d30b50031f8bcad9801c805b6e7ade9fbe3c1c"} Mar 19 20:25:38 crc kubenswrapper[4799]: I0319 20:25:38.435243 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" event={"ID":"eb382d8f-526e-4f8c-bc45-d1afd184fb98","Type":"ContainerStarted","Data":"3975bc0e0caaa101f99848db60036ac2d9f82a22215243eb052d2a1bec787e85"} Mar 19 20:25:38 crc kubenswrapper[4799]: I0319 20:25:38.457525 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" podStartSLOduration=2.457487567 podStartE2EDuration="2.457487567s" podCreationTimestamp="2026-03-19 20:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:38.448731307 +0000 UTC m=+1216.054684369" watchObservedRunningTime="2026-03-19 20:25:38.457487567 +0000 UTC m=+1216.063440629" Mar 19 20:25:38 crc kubenswrapper[4799]: I0319 20:25:38.470310 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" podStartSLOduration=3.470291607 podStartE2EDuration="3.470291607s" podCreationTimestamp="2026-03-19 20:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:38.465364152 +0000 UTC m=+1216.071317224" watchObservedRunningTime="2026-03-19 20:25:38.470291607 +0000 UTC m=+1216.076244679" Mar 19 20:25:39 crc kubenswrapper[4799]: I0319 20:25:39.159419 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:39 crc kubenswrapper[4799]: I0319 20:25:39.205377 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:25:39 crc kubenswrapper[4799]: I0319 20:25:39.445739 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.453984 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"21c12faf-4246-42fa-aad3-031afd1357f2","Type":"ContainerStarted","Data":"0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a"} Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.454269 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"21c12faf-4246-42fa-aad3-031afd1357f2","Type":"ContainerStarted","Data":"0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711"} Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.457272 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74a42150-ef0f-4283-9627-a01724dc2a45","Type":"ContainerStarted","Data":"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785"} Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.457305 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74a42150-ef0f-4283-9627-a01724dc2a45","Type":"ContainerStarted","Data":"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec"} Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.457508 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-metadata" containerID="cri-o://caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785" gracePeriod=30 Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.457498 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-log" containerID="cri-o://64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec" gracePeriod=30 Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.462891 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"fe9fa15c-1183-4a78-9068-64ecc9e4397e","Type":"ContainerStarted","Data":"2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653"} Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.467885 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"671cfbdc-6f8c-452e-a771-6a762660c355","Type":"ContainerStarted","Data":"74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030"} Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.467968 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="671cfbdc-6f8c-452e-a771-6a762660c355" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030" gracePeriod=30 Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.478104 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.380493192 podStartE2EDuration="5.478072575s" podCreationTimestamp="2026-03-19 20:25:35 +0000 UTC" firstStartedPulling="2026-03-19 20:25:36.331402462 +0000 UTC m=+1213.937355534" lastFinishedPulling="2026-03-19 20:25:39.428981845 +0000 UTC m=+1217.034934917" observedRunningTime="2026-03-19 20:25:40.473326365 +0000 UTC m=+1218.079279437" watchObservedRunningTime="2026-03-19 20:25:40.478072575 +0000 UTC m=+1218.084025647" Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.491721 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.398283268 podStartE2EDuration="5.491698107s" podCreationTimestamp="2026-03-19 20:25:35 +0000 UTC" firstStartedPulling="2026-03-19 20:25:36.335851764 +0000 UTC m=+1213.941804846" lastFinishedPulling="2026-03-19 20:25:39.429266603 +0000 UTC m=+1217.035219685" observedRunningTime="2026-03-19 20:25:40.487914914 +0000 UTC m=+1218.093868006" 
watchObservedRunningTime="2026-03-19 20:25:40.491698107 +0000 UTC m=+1218.097651179" Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.519798 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.763955053 podStartE2EDuration="5.519775716s" podCreationTimestamp="2026-03-19 20:25:35 +0000 UTC" firstStartedPulling="2026-03-19 20:25:36.673238704 +0000 UTC m=+1214.279191776" lastFinishedPulling="2026-03-19 20:25:39.429059367 +0000 UTC m=+1217.035012439" observedRunningTime="2026-03-19 20:25:40.510705727 +0000 UTC m=+1218.116658799" watchObservedRunningTime="2026-03-19 20:25:40.519775716 +0000 UTC m=+1218.125728788" Mar 19 20:25:40 crc kubenswrapper[4799]: I0319 20:25:40.531739 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.873002386 podStartE2EDuration="5.531709162s" podCreationTimestamp="2026-03-19 20:25:35 +0000 UTC" firstStartedPulling="2026-03-19 20:25:36.784177619 +0000 UTC m=+1214.390130691" lastFinishedPulling="2026-03-19 20:25:39.442884395 +0000 UTC m=+1217.048837467" observedRunningTime="2026-03-19 20:25:40.527172658 +0000 UTC m=+1218.133125730" watchObservedRunningTime="2026-03-19 20:25:40.531709162 +0000 UTC m=+1218.137662234" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.067848 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.089877 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.149688 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.152366 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-combined-ca-bundle\") pod \"74a42150-ef0f-4283-9627-a01724dc2a45\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.152452 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-config-data\") pod \"74a42150-ef0f-4283-9627-a01724dc2a45\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.152597 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrt7f\" (UniqueName: \"kubernetes.io/projected/74a42150-ef0f-4283-9627-a01724dc2a45-kube-api-access-rrt7f\") pod \"74a42150-ef0f-4283-9627-a01724dc2a45\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.152737 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a42150-ef0f-4283-9627-a01724dc2a45-logs\") pod \"74a42150-ef0f-4283-9627-a01724dc2a45\" (UID: \"74a42150-ef0f-4283-9627-a01724dc2a45\") " Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.153157 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74a42150-ef0f-4283-9627-a01724dc2a45-logs" (OuterVolumeSpecName: "logs") pod "74a42150-ef0f-4283-9627-a01724dc2a45" (UID: "74a42150-ef0f-4283-9627-a01724dc2a45"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.153629 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74a42150-ef0f-4283-9627-a01724dc2a45-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.162640 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a42150-ef0f-4283-9627-a01724dc2a45-kube-api-access-rrt7f" (OuterVolumeSpecName: "kube-api-access-rrt7f") pod "74a42150-ef0f-4283-9627-a01724dc2a45" (UID: "74a42150-ef0f-4283-9627-a01724dc2a45"). InnerVolumeSpecName "kube-api-access-rrt7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.180246 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74a42150-ef0f-4283-9627-a01724dc2a45" (UID: "74a42150-ef0f-4283-9627-a01724dc2a45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.189582 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-config-data" (OuterVolumeSpecName: "config-data") pod "74a42150-ef0f-4283-9627-a01724dc2a45" (UID: "74a42150-ef0f-4283-9627-a01724dc2a45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.255108 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.255142 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74a42150-ef0f-4283-9627-a01724dc2a45-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.255153 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrt7f\" (UniqueName: \"kubernetes.io/projected/74a42150-ef0f-4283-9627-a01724dc2a45-kube-api-access-rrt7f\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.489970 4799 generic.go:334] "Generic (PLEG): container finished" podID="74a42150-ef0f-4283-9627-a01724dc2a45" containerID="caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785" exitCode=0 Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.490008 4799 generic.go:334] "Generic (PLEG): container finished" podID="74a42150-ef0f-4283-9627-a01724dc2a45" containerID="64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec" exitCode=143 Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.490062 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.490060 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74a42150-ef0f-4283-9627-a01724dc2a45","Type":"ContainerDied","Data":"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785"} Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.490130 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74a42150-ef0f-4283-9627-a01724dc2a45","Type":"ContainerDied","Data":"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec"} Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.490154 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74a42150-ef0f-4283-9627-a01724dc2a45","Type":"ContainerDied","Data":"203d258f5341b4048dbe480b4a06b0d6aa553b0fc60faf23f247286c88d1da2f"} Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.490185 4799 scope.go:117] "RemoveContainer" containerID="caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.563027 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.570892 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.577478 4799 scope.go:117] "RemoveContainer" containerID="64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.595879 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:41 crc kubenswrapper[4799]: E0319 20:25:41.596499 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-metadata" Mar 19 20:25:41 crc 
kubenswrapper[4799]: I0319 20:25:41.596520 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-metadata" Mar 19 20:25:41 crc kubenswrapper[4799]: E0319 20:25:41.596542 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-log" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.596553 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-log" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.596765 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-log" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.596801 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" containerName="nova-metadata-metadata" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.597945 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.602356 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.604229 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.608928 4799 scope.go:117] "RemoveContainer" containerID="caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.609432 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:41 crc kubenswrapper[4799]: E0319 20:25:41.609465 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785\": container with ID starting with caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785 not found: ID does not exist" containerID="caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.609511 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785"} err="failed to get container status \"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785\": rpc error: code = NotFound desc = could not find container \"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785\": container with ID starting with caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785 not found: ID does not exist" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.609548 4799 scope.go:117] "RemoveContainer" containerID="64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec" Mar 19 20:25:41 crc 
kubenswrapper[4799]: E0319 20:25:41.609998 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec\": container with ID starting with 64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec not found: ID does not exist" containerID="64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.610031 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec"} err="failed to get container status \"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec\": rpc error: code = NotFound desc = could not find container \"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec\": container with ID starting with 64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec not found: ID does not exist" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.610056 4799 scope.go:117] "RemoveContainer" containerID="caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.612192 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785"} err="failed to get container status \"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785\": rpc error: code = NotFound desc = could not find container \"caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785\": container with ID starting with caea2adda7e0ff8dffa2d18c20c6200c6925bfab91c43294fbb90defcd25d785 not found: ID does not exist" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.612235 4799 scope.go:117] "RemoveContainer" containerID="64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec" Mar 19 
20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.612675 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec"} err="failed to get container status \"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec\": rpc error: code = NotFound desc = could not find container \"64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec\": container with ID starting with 64614146c57890a8f6078d1d94c7a1867a98383b90457fb2050742c8c5419bec not found: ID does not exist" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.665825 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.665921 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l767w\" (UniqueName: \"kubernetes.io/projected/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-kube-api-access-l767w\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.666143 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.666358 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-logs\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.666420 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-config-data\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.768115 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.768224 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-logs\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.768250 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-config-data\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.768286 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 
20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.768332 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l767w\" (UniqueName: \"kubernetes.io/projected/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-kube-api-access-l767w\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.769177 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-logs\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.773406 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.773806 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-config-data\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.774013 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.790761 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l767w\" (UniqueName: 
\"kubernetes.io/projected/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-kube-api-access-l767w\") pod \"nova-metadata-0\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " pod="openstack/nova-metadata-0" Mar 19 20:25:41 crc kubenswrapper[4799]: I0319 20:25:41.935644 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:42 crc kubenswrapper[4799]: I0319 20:25:42.436093 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:42 crc kubenswrapper[4799]: W0319 20:25:42.438809 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25e1de81_5ba2_4f6c_8a25_60f6db7a2bef.slice/crio-f113ef362069204a31ec05010e47f22249b6beb321fd0633e9c7115877d6de4b WatchSource:0}: Error finding container f113ef362069204a31ec05010e47f22249b6beb321fd0633e9c7115877d6de4b: Status 404 returned error can't find the container with id f113ef362069204a31ec05010e47f22249b6beb321fd0633e9c7115877d6de4b Mar 19 20:25:42 crc kubenswrapper[4799]: I0319 20:25:42.523171 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef","Type":"ContainerStarted","Data":"f113ef362069204a31ec05010e47f22249b6beb321fd0633e9c7115877d6de4b"} Mar 19 20:25:43 crc kubenswrapper[4799]: I0319 20:25:43.136014 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a42150-ef0f-4283-9627-a01724dc2a45" path="/var/lib/kubelet/pods/74a42150-ef0f-4283-9627-a01724dc2a45/volumes" Mar 19 20:25:43 crc kubenswrapper[4799]: I0319 20:25:43.538347 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef","Type":"ContainerStarted","Data":"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f"} Mar 19 20:25:43 crc kubenswrapper[4799]: I0319 20:25:43.538444 4799 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef","Type":"ContainerStarted","Data":"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c"} Mar 19 20:25:43 crc kubenswrapper[4799]: I0319 20:25:43.540779 4799 generic.go:334] "Generic (PLEG): container finished" podID="0c1429cf-e6a9-4c99-af55-4ab7d80cd651" containerID="6a4e2db20e948f9df36747bb9ee78d065c8a7e573258544955ca98c780dbd65a" exitCode=0 Mar 19 20:25:43 crc kubenswrapper[4799]: I0319 20:25:43.540824 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-59gtz" event={"ID":"0c1429cf-e6a9-4c99-af55-4ab7d80cd651","Type":"ContainerDied","Data":"6a4e2db20e948f9df36747bb9ee78d065c8a7e573258544955ca98c780dbd65a"} Mar 19 20:25:43 crc kubenswrapper[4799]: I0319 20:25:43.577623 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5776038999999997 podStartE2EDuration="2.5776039s" podCreationTimestamp="2026-03-19 20:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:43.56080312 +0000 UTC m=+1221.166756192" watchObservedRunningTime="2026-03-19 20:25:43.5776039 +0000 UTC m=+1221.183556972" Mar 19 20:25:44 crc kubenswrapper[4799]: I0319 20:25:44.566897 4799 generic.go:334] "Generic (PLEG): container finished" podID="7edd0f4d-c473-43d0-acad-40ba65e1ae5b" containerID="f651de3286bdd3b79e81fe20e4d30b50031f8bcad9801c805b6e7ade9fbe3c1c" exitCode=0 Mar 19 20:25:44 crc kubenswrapper[4799]: I0319 20:25:44.567036 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" event={"ID":"7edd0f4d-c473-43d0-acad-40ba65e1ae5b","Type":"ContainerDied","Data":"f651de3286bdd3b79e81fe20e4d30b50031f8bcad9801c805b6e7ade9fbe3c1c"} Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.067049 4799 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.133656 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-scripts\") pod \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.133826 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-combined-ca-bundle\") pod \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.133903 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-config-data\") pod \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.133941 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhm6\" (UniqueName: \"kubernetes.io/projected/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-kube-api-access-ljhm6\") pod \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\" (UID: \"0c1429cf-e6a9-4c99-af55-4ab7d80cd651\") " Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.143777 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-scripts" (OuterVolumeSpecName: "scripts") pod "0c1429cf-e6a9-4c99-af55-4ab7d80cd651" (UID: "0c1429cf-e6a9-4c99-af55-4ab7d80cd651"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.153691 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-kube-api-access-ljhm6" (OuterVolumeSpecName: "kube-api-access-ljhm6") pod "0c1429cf-e6a9-4c99-af55-4ab7d80cd651" (UID: "0c1429cf-e6a9-4c99-af55-4ab7d80cd651"). InnerVolumeSpecName "kube-api-access-ljhm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.192317 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-config-data" (OuterVolumeSpecName: "config-data") pod "0c1429cf-e6a9-4c99-af55-4ab7d80cd651" (UID: "0c1429cf-e6a9-4c99-af55-4ab7d80cd651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.197904 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c1429cf-e6a9-4c99-af55-4ab7d80cd651" (UID: "0c1429cf-e6a9-4c99-af55-4ab7d80cd651"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.235541 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.235584 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljhm6\" (UniqueName: \"kubernetes.io/projected/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-kube-api-access-ljhm6\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.235599 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.235612 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c1429cf-e6a9-4c99-af55-4ab7d80cd651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.585067 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-59gtz" event={"ID":"0c1429cf-e6a9-4c99-af55-4ab7d80cd651","Type":"ContainerDied","Data":"110a7ed3e804a8ea936818525f0a9bc8d0db087f9500ce2ba4168786634a4a43"} Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.585111 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-59gtz" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.585122 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110a7ed3e804a8ea936818525f0a9bc8d0db087f9500ce2ba4168786634a4a43" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.812076 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.812987 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.813034 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.813155 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-log" containerID="cri-o://0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711" gracePeriod=30 Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.813892 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-api" containerID="cri-o://0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a" gracePeriod=30 Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.831122 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": EOF" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.831176 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.193:8774/\": EOF" Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.858462 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.858736 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fe9fa15c-1183-4a78-9068-64ecc9e4397e" containerName="nova-scheduler-scheduler" containerID="cri-o://2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653" gracePeriod=30 Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.881733 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.882891 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-log" containerID="cri-o://73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c" gracePeriod=30 Mar 19 20:25:45 crc kubenswrapper[4799]: I0319 20:25:45.883447 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-metadata" containerID="cri-o://4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f" gracePeriod=30 Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.097595 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.163197 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-9gfg8"] Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.164641 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" 
containerName="dnsmasq-dns" containerID="cri-o://a44643ffc07120d1e32ea8fd0ade77e4f2ce18b92b138b99a379874b44443eca" gracePeriod=10 Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.178962 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.366428 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-scripts\") pod \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.366604 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jn7s\" (UniqueName: \"kubernetes.io/projected/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-kube-api-access-6jn7s\") pod \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.366676 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-combined-ca-bundle\") pod \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.366713 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-config-data\") pod \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\" (UID: \"7edd0f4d-c473-43d0-acad-40ba65e1ae5b\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.372694 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-kube-api-access-6jn7s" (OuterVolumeSpecName: 
"kube-api-access-6jn7s") pod "7edd0f4d-c473-43d0-acad-40ba65e1ae5b" (UID: "7edd0f4d-c473-43d0-acad-40ba65e1ae5b"). InnerVolumeSpecName "kube-api-access-6jn7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.375450 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-scripts" (OuterVolumeSpecName: "scripts") pod "7edd0f4d-c473-43d0-acad-40ba65e1ae5b" (UID: "7edd0f4d-c473-43d0-acad-40ba65e1ae5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.398514 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-config-data" (OuterVolumeSpecName: "config-data") pod "7edd0f4d-c473-43d0-acad-40ba65e1ae5b" (UID: "7edd0f4d-c473-43d0-acad-40ba65e1ae5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.418598 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7edd0f4d-c473-43d0-acad-40ba65e1ae5b" (UID: "7edd0f4d-c473-43d0-acad-40ba65e1ae5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.442676 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.469085 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jn7s\" (UniqueName: \"kubernetes.io/projected/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-kube-api-access-6jn7s\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.469327 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.469398 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.469467 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7edd0f4d-c473-43d0-acad-40ba65e1ae5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.570275 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-nova-metadata-tls-certs\") pod \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.570431 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l767w\" (UniqueName: \"kubernetes.io/projected/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-kube-api-access-l767w\") pod \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.570610 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-logs\") pod \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.570656 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-config-data\") pod \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.570768 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-combined-ca-bundle\") pod \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\" (UID: \"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.570944 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-logs" (OuterVolumeSpecName: "logs") pod "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" (UID: "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.571618 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.573480 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-kube-api-access-l767w" (OuterVolumeSpecName: "kube-api-access-l767w") pod "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" (UID: "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef"). InnerVolumeSpecName "kube-api-access-l767w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.596656 4799 generic.go:334] "Generic (PLEG): container finished" podID="21c12faf-4246-42fa-aad3-031afd1357f2" containerID="0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711" exitCode=143 Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.596722 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"21c12faf-4246-42fa-aad3-031afd1357f2","Type":"ContainerDied","Data":"0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711"} Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.599419 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-config-data" (OuterVolumeSpecName: "config-data") pod "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" (UID: "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.605615 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.605760 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ftd7g" event={"ID":"7edd0f4d-c473-43d0-acad-40ba65e1ae5b","Type":"ContainerDied","Data":"0941baa2abde9a18cee4917a870e95dd1c4ffb0c31bece27a0f1a2b94602d489"} Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.605792 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0941baa2abde9a18cee4917a870e95dd1c4ffb0c31bece27a0f1a2b94602d489" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.607923 4799 generic.go:334] "Generic (PLEG): container finished" podID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerID="4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f" exitCode=0 Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.607947 4799 generic.go:334] "Generic (PLEG): container finished" podID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerID="73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c" exitCode=143 Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.607988 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef","Type":"ContainerDied","Data":"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f"} Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.608013 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef","Type":"ContainerDied","Data":"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c"} Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.608022 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"25e1de81-5ba2-4f6c-8a25-60f6db7a2bef","Type":"ContainerDied","Data":"f113ef362069204a31ec05010e47f22249b6beb321fd0633e9c7115877d6de4b"} Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.608036 4799 scope.go:117] "RemoveContainer" containerID="4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.608155 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.618879 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" (UID: "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.622471 4799 generic.go:334] "Generic (PLEG): container finished" podID="68702358-3d14-4012-9f8d-cecd4517ced7" containerID="a44643ffc07120d1e32ea8fd0ade77e4f2ce18b92b138b99a379874b44443eca" exitCode=0 Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.622521 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" event={"ID":"68702358-3d14-4012-9f8d-cecd4517ced7","Type":"ContainerDied","Data":"a44643ffc07120d1e32ea8fd0ade77e4f2ce18b92b138b99a379874b44443eca"} Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.638651 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.641449 4799 scope.go:117] "RemoveContainer" containerID="73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.648956 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" (UID: "25e1de81-5ba2-4f6c-8a25-60f6db7a2bef"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.672917 4799 scope.go:117] "RemoveContainer" containerID="4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.673407 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f\": container with ID starting with 4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f not found: ID does not exist" containerID="4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.673458 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f"} err="failed to get container status \"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f\": rpc error: code = NotFound desc = could not find container \"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f\": container with ID starting with 4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f not found: ID does not exist" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 
20:25:46.673487 4799 scope.go:117] "RemoveContainer" containerID="73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.673815 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c\": container with ID starting with 73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c not found: ID does not exist" containerID="73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.673857 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c"} err="failed to get container status \"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c\": rpc error: code = NotFound desc = could not find container \"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c\": container with ID starting with 73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c not found: ID does not exist" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.673889 4799 scope.go:117] "RemoveContainer" containerID="4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.675558 4799 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.675583 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l767w\" (UniqueName: \"kubernetes.io/projected/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-kube-api-access-l767w\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.675596 
4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.675609 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.678496 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f"} err="failed to get container status \"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f\": rpc error: code = NotFound desc = could not find container \"4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f\": container with ID starting with 4343777ad8a0a83418560036106b9d4ec866510322dd86468c7ce0cb5a3c594f not found: ID does not exist" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.678527 4799 scope.go:117] "RemoveContainer" containerID="73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.685741 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c"} err="failed to get container status \"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c\": rpc error: code = NotFound desc = could not find container \"73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c\": container with ID starting with 73ed26cb3d9d92e8e9e0c5c89a58507cf81868c26fdc226ec5e665737663ef7c not found: ID does not exist" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.706952 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 
20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.707426 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edd0f4d-c473-43d0-acad-40ba65e1ae5b" containerName="nova-cell1-conductor-db-sync" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707444 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edd0f4d-c473-43d0-acad-40ba65e1ae5b" containerName="nova-cell1-conductor-db-sync" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.707458 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-metadata" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707464 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-metadata" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.707478 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-log" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707484 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-log" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.707499 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1429cf-e6a9-4c99-af55-4ab7d80cd651" containerName="nova-manage" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707505 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1429cf-e6a9-4c99-af55-4ab7d80cd651" containerName="nova-manage" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.707522 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" containerName="dnsmasq-dns" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707527 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" 
containerName="dnsmasq-dns" Mar 19 20:25:46 crc kubenswrapper[4799]: E0319 20:25:46.707541 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" containerName="init" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707547 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" containerName="init" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707695 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edd0f4d-c473-43d0-acad-40ba65e1ae5b" containerName="nova-cell1-conductor-db-sync" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707712 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" containerName="dnsmasq-dns" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707725 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c1429cf-e6a9-4c99-af55-4ab7d80cd651" containerName="nova-manage" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707742 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-metadata" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.707754 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" containerName="nova-metadata-log" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.708366 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.712582 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.723715 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.776561 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-sb\") pod \"68702358-3d14-4012-9f8d-cecd4517ced7\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.776606 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-config\") pod \"68702358-3d14-4012-9f8d-cecd4517ced7\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777000 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-swift-storage-0\") pod \"68702358-3d14-4012-9f8d-cecd4517ced7\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777094 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-nb\") pod \"68702358-3d14-4012-9f8d-cecd4517ced7\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777210 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-svc\") pod \"68702358-3d14-4012-9f8d-cecd4517ced7\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777263 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/68702358-3d14-4012-9f8d-cecd4517ced7-kube-api-access-42bnj\") pod \"68702358-3d14-4012-9f8d-cecd4517ced7\" (UID: \"68702358-3d14-4012-9f8d-cecd4517ced7\") " Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777722 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975gg\" (UniqueName: \"kubernetes.io/projected/df1146cf-6649-441e-b17f-bfbebbdc3439-kube-api-access-975gg\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777814 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df1146cf-6649-441e-b17f-bfbebbdc3439-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.777843 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1146cf-6649-441e-b17f-bfbebbdc3439-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.781092 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68702358-3d14-4012-9f8d-cecd4517ced7-kube-api-access-42bnj" (OuterVolumeSpecName: 
"kube-api-access-42bnj") pod "68702358-3d14-4012-9f8d-cecd4517ced7" (UID: "68702358-3d14-4012-9f8d-cecd4517ced7"). InnerVolumeSpecName "kube-api-access-42bnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.829418 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68702358-3d14-4012-9f8d-cecd4517ced7" (UID: "68702358-3d14-4012-9f8d-cecd4517ced7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.835318 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68702358-3d14-4012-9f8d-cecd4517ced7" (UID: "68702358-3d14-4012-9f8d-cecd4517ced7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.843436 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-config" (OuterVolumeSpecName: "config") pod "68702358-3d14-4012-9f8d-cecd4517ced7" (UID: "68702358-3d14-4012-9f8d-cecd4517ced7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.844915 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68702358-3d14-4012-9f8d-cecd4517ced7" (UID: "68702358-3d14-4012-9f8d-cecd4517ced7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.846315 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "68702358-3d14-4012-9f8d-cecd4517ced7" (UID: "68702358-3d14-4012-9f8d-cecd4517ced7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.879472 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df1146cf-6649-441e-b17f-bfbebbdc3439-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.879562 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1146cf-6649-441e-b17f-bfbebbdc3439-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880080 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-975gg\" (UniqueName: \"kubernetes.io/projected/df1146cf-6649-441e-b17f-bfbebbdc3439-kube-api-access-975gg\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880172 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880193 4799 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42bnj\" (UniqueName: \"kubernetes.io/projected/68702358-3d14-4012-9f8d-cecd4517ced7-kube-api-access-42bnj\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880209 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880219 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880231 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.880245 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68702358-3d14-4012-9f8d-cecd4517ced7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.883506 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df1146cf-6649-441e-b17f-bfbebbdc3439-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.883747 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df1146cf-6649-441e-b17f-bfbebbdc3439-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " 
pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.894677 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-975gg\" (UniqueName: \"kubernetes.io/projected/df1146cf-6649-441e-b17f-bfbebbdc3439-kube-api-access-975gg\") pod \"nova-cell1-conductor-0\" (UID: \"df1146cf-6649-441e-b17f-bfbebbdc3439\") " pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.944035 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.961106 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.977348 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.979604 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.981403 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.982112 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.982208 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4dda1e2e-c642-491e-ba96-26e9638bc902-logs\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.982283 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-config-data\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:46 crc kubenswrapper[4799]: I0319 20:25:46.982303 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-824r6\" (UniqueName: \"kubernetes.io/projected/4dda1e2e-c642-491e-ba96-26e9638bc902-kube-api-access-824r6\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:46.991421 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:46.997115 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.000747 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.026681 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.085847 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.085907 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.085944 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dda1e2e-c642-491e-ba96-26e9638bc902-logs\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.085979 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-config-data\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.085996 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-824r6\" (UniqueName: \"kubernetes.io/projected/4dda1e2e-c642-491e-ba96-26e9638bc902-kube-api-access-824r6\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.086745 4799 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dda1e2e-c642-491e-ba96-26e9638bc902-logs\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.090124 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.091581 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-config-data\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.097363 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.113979 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-824r6\" (UniqueName: \"kubernetes.io/projected/4dda1e2e-c642-491e-ba96-26e9638bc902-kube-api-access-824r6\") pod \"nova-metadata-0\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.139735 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e1de81-5ba2-4f6c-8a25-60f6db7a2bef" path="/var/lib/kubelet/pods/25e1de81-5ba2-4f6c-8a25-60f6db7a2bef/volumes" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.192285 4799 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.328007 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.505501 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-combined-ca-bundle\") pod \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.506069 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr5qx\" (UniqueName: \"kubernetes.io/projected/fe9fa15c-1183-4a78-9068-64ecc9e4397e-kube-api-access-wr5qx\") pod \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.506128 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-config-data\") pod \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\" (UID: \"fe9fa15c-1183-4a78-9068-64ecc9e4397e\") " Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.512509 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9fa15c-1183-4a78-9068-64ecc9e4397e-kube-api-access-wr5qx" (OuterVolumeSpecName: "kube-api-access-wr5qx") pod "fe9fa15c-1183-4a78-9068-64ecc9e4397e" (UID: "fe9fa15c-1183-4a78-9068-64ecc9e4397e"). InnerVolumeSpecName "kube-api-access-wr5qx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.515123 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.542802 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe9fa15c-1183-4a78-9068-64ecc9e4397e" (UID: "fe9fa15c-1183-4a78-9068-64ecc9e4397e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.543290 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-config-data" (OuterVolumeSpecName: "config-data") pod "fe9fa15c-1183-4a78-9068-64ecc9e4397e" (UID: "fe9fa15c-1183-4a78-9068-64ecc9e4397e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.609229 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr5qx\" (UniqueName: \"kubernetes.io/projected/fe9fa15c-1183-4a78-9068-64ecc9e4397e-kube-api-access-wr5qx\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.609283 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.609307 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9fa15c-1183-4a78-9068-64ecc9e4397e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.633407 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.633432 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-9gfg8" event={"ID":"68702358-3d14-4012-9f8d-cecd4517ced7","Type":"ContainerDied","Data":"701474f48885379aa28bf4a970eec2c1f339224e976ab6051e9d619863f3c05d"} Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.633833 4799 scope.go:117] "RemoveContainer" containerID="a44643ffc07120d1e32ea8fd0ade77e4f2ce18b92b138b99a379874b44443eca" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.634629 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df1146cf-6649-441e-b17f-bfbebbdc3439","Type":"ContainerStarted","Data":"b83f82697d1b2a5fe43978333fe8b9bd701e2901a5b02b4ad88c3b5c320bba97"} Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.640069 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="fe9fa15c-1183-4a78-9068-64ecc9e4397e" containerID="2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653" exitCode=0 Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.640203 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe9fa15c-1183-4a78-9068-64ecc9e4397e","Type":"ContainerDied","Data":"2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653"} Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.640297 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fe9fa15c-1183-4a78-9068-64ecc9e4397e","Type":"ContainerDied","Data":"bd44b6bf36364d0ebccfba0f8a37187419da6c8b64c44b3a2a1d5722ac068f08"} Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.640411 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.658352 4799 scope.go:117] "RemoveContainer" containerID="691d8ab6cf09593e55fec25314b3e29bc313d928f046f80a539491400ca9802e" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.704888 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-9gfg8"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.715010 4799 scope.go:117] "RemoveContainer" containerID="2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.730515 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-9gfg8"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.739625 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.745049 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.751666 4799 scope.go:117] 
"RemoveContainer" containerID="2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653" Mar 19 20:25:47 crc kubenswrapper[4799]: E0319 20:25:47.752266 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653\": container with ID starting with 2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653 not found: ID does not exist" containerID="2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.752318 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653"} err="failed to get container status \"2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653\": rpc error: code = NotFound desc = could not find container \"2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653\": container with ID starting with 2ff1eec72b1652aa96c96711b3e2e654256acecc0c6baf40e981437f13614653 not found: ID does not exist" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.754366 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: E0319 20:25:47.754742 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe9fa15c-1183-4a78-9068-64ecc9e4397e" containerName="nova-scheduler-scheduler" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.754762 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9fa15c-1183-4a78-9068-64ecc9e4397e" containerName="nova-scheduler-scheduler" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.754917 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe9fa15c-1183-4a78-9068-64ecc9e4397e" containerName="nova-scheduler-scheduler" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.755518 4799 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.757596 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.767664 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.781107 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.814253 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-config-data\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.814324 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5478\" (UniqueName: \"kubernetes.io/projected/e2555dac-6f8e-4fb5-9281-f21c939c8077-kube-api-access-b5478\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.814443 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.916186 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5478\" (UniqueName: 
\"kubernetes.io/projected/e2555dac-6f8e-4fb5-9281-f21c939c8077-kube-api-access-b5478\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.916256 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.916461 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-config-data\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.921946 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.922219 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-config-data\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " pod="openstack/nova-scheduler-0" Mar 19 20:25:47 crc kubenswrapper[4799]: I0319 20:25:47.932576 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5478\" (UniqueName: \"kubernetes.io/projected/e2555dac-6f8e-4fb5-9281-f21c939c8077-kube-api-access-b5478\") pod \"nova-scheduler-0\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " 
pod="openstack/nova-scheduler-0" Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.079232 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.356265 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:25:48 crc kubenswrapper[4799]: W0319 20:25:48.368047 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2555dac_6f8e_4fb5_9281_f21c939c8077.slice/crio-9b036b39f1171f5031ff4125e2816942689c488e5d16bdd7f641151d613c436a WatchSource:0}: Error finding container 9b036b39f1171f5031ff4125e2816942689c488e5d16bdd7f641151d613c436a: Status 404 returned error can't find the container with id 9b036b39f1171f5031ff4125e2816942689c488e5d16bdd7f641151d613c436a Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.654968 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2555dac-6f8e-4fb5-9281-f21c939c8077","Type":"ContainerStarted","Data":"58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86"} Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.655246 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2555dac-6f8e-4fb5-9281-f21c939c8077","Type":"ContainerStarted","Data":"9b036b39f1171f5031ff4125e2816942689c488e5d16bdd7f641151d613c436a"} Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.656605 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4dda1e2e-c642-491e-ba96-26e9638bc902","Type":"ContainerStarted","Data":"fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c"} Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.656626 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4dda1e2e-c642-491e-ba96-26e9638bc902","Type":"ContainerStarted","Data":"ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476"} Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.656634 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4dda1e2e-c642-491e-ba96-26e9638bc902","Type":"ContainerStarted","Data":"c370cc172844760b98db0917b14890e91f070b597dfca14ce691a2ce781f975b"} Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.660580 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"df1146cf-6649-441e-b17f-bfbebbdc3439","Type":"ContainerStarted","Data":"efa888969aa4bbcc9fa925dfd08705eaa2629a9890f49d011f0276374a70323a"} Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.660695 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.685024 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.685006706 podStartE2EDuration="1.685006706s" podCreationTimestamp="2026-03-19 20:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:48.672199086 +0000 UTC m=+1226.278152158" watchObservedRunningTime="2026-03-19 20:25:48.685006706 +0000 UTC m=+1226.290959778" Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.692976 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.692961044 podStartE2EDuration="2.692961044s" podCreationTimestamp="2026-03-19 20:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:48.687996518 +0000 UTC m=+1226.293949590" 
watchObservedRunningTime="2026-03-19 20:25:48.692961044 +0000 UTC m=+1226.298914116" Mar 19 20:25:48 crc kubenswrapper[4799]: I0319 20:25:48.711667 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.711650445 podStartE2EDuration="2.711650445s" podCreationTimestamp="2026-03-19 20:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:48.704595312 +0000 UTC m=+1226.310548384" watchObservedRunningTime="2026-03-19 20:25:48.711650445 +0000 UTC m=+1226.317603517" Mar 19 20:25:49 crc kubenswrapper[4799]: I0319 20:25:49.134346 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68702358-3d14-4012-9f8d-cecd4517ced7" path="/var/lib/kubelet/pods/68702358-3d14-4012-9f8d-cecd4517ced7/volumes" Mar 19 20:25:49 crc kubenswrapper[4799]: I0319 20:25:49.135950 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe9fa15c-1183-4a78-9068-64ecc9e4397e" path="/var/lib/kubelet/pods/fe9fa15c-1183-4a78-9068-64ecc9e4397e/volumes" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.063295 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.706549 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.752552 4799 generic.go:334] "Generic (PLEG): container finished" podID="21c12faf-4246-42fa-aad3-031afd1357f2" containerID="0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a" exitCode=0 Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.752603 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"21c12faf-4246-42fa-aad3-031afd1357f2","Type":"ContainerDied","Data":"0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a"} Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.752635 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"21c12faf-4246-42fa-aad3-031afd1357f2","Type":"ContainerDied","Data":"e66f9d9c03fda8455fb37636960a53f0ae063097ac9dc3dacda812d04c04521a"} Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.752655 4799 scope.go:117] "RemoveContainer" containerID="0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.752817 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.779560 4799 scope.go:117] "RemoveContainer" containerID="0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.801220 4799 scope.go:117] "RemoveContainer" containerID="0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a" Mar 19 20:25:52 crc kubenswrapper[4799]: E0319 20:25:52.801657 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a\": container with ID starting with 0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a not found: ID does not exist" containerID="0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.801806 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a"} err="failed to get container status \"0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a\": rpc error: code = NotFound desc = could not find container \"0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a\": container with ID starting with 0739cfeef855ee85572d0112834f6d5b5a3ba9fc3bf241f7b6f3823b375a134a not found: ID does not exist" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.801920 4799 scope.go:117] "RemoveContainer" containerID="0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711" Mar 19 20:25:52 crc kubenswrapper[4799]: E0319 20:25:52.802286 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711\": container with ID starting with 
0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711 not found: ID does not exist" containerID="0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.802525 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711"} err="failed to get container status \"0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711\": rpc error: code = NotFound desc = could not find container \"0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711\": container with ID starting with 0cdfb93b342ac62fe92cd81e19d7e1c6d689dfbff4eb4db9d255f1399970a711 not found: ID does not exist" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.847051 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9kts\" (UniqueName: \"kubernetes.io/projected/21c12faf-4246-42fa-aad3-031afd1357f2-kube-api-access-l9kts\") pod \"21c12faf-4246-42fa-aad3-031afd1357f2\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.847891 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-config-data\") pod \"21c12faf-4246-42fa-aad3-031afd1357f2\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.848035 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-combined-ca-bundle\") pod \"21c12faf-4246-42fa-aad3-031afd1357f2\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.848086 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/21c12faf-4246-42fa-aad3-031afd1357f2-logs\") pod \"21c12faf-4246-42fa-aad3-031afd1357f2\" (UID: \"21c12faf-4246-42fa-aad3-031afd1357f2\") " Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.848846 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c12faf-4246-42fa-aad3-031afd1357f2-logs" (OuterVolumeSpecName: "logs") pod "21c12faf-4246-42fa-aad3-031afd1357f2" (UID: "21c12faf-4246-42fa-aad3-031afd1357f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.852886 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c12faf-4246-42fa-aad3-031afd1357f2-kube-api-access-l9kts" (OuterVolumeSpecName: "kube-api-access-l9kts") pod "21c12faf-4246-42fa-aad3-031afd1357f2" (UID: "21c12faf-4246-42fa-aad3-031afd1357f2"). InnerVolumeSpecName "kube-api-access-l9kts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.878479 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21c12faf-4246-42fa-aad3-031afd1357f2" (UID: "21c12faf-4246-42fa-aad3-031afd1357f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.889337 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-config-data" (OuterVolumeSpecName: "config-data") pod "21c12faf-4246-42fa-aad3-031afd1357f2" (UID: "21c12faf-4246-42fa-aad3-031afd1357f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.951085 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.951143 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c12faf-4246-42fa-aad3-031afd1357f2-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.951164 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9kts\" (UniqueName: \"kubernetes.io/projected/21c12faf-4246-42fa-aad3-031afd1357f2-kube-api-access-l9kts\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:52 crc kubenswrapper[4799]: I0319 20:25:52.951184 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c12faf-4246-42fa-aad3-031afd1357f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.079467 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.144379 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.196313 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.209905 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:53 crc kubenswrapper[4799]: E0319 20:25:53.210347 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-api" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.210377 4799 
state_mem.go:107] "Deleted CPUSet assignment" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-api" Mar 19 20:25:53 crc kubenswrapper[4799]: E0319 20:25:53.210461 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-log" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.210475 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-log" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.210821 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-log" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.210884 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" containerName="nova-api-api" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.212804 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.215495 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.221692 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.359259 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.359304 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrdd\" (UniqueName: \"kubernetes.io/projected/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-kube-api-access-qtrdd\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.359580 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-logs\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.359855 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-config-data\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.461562 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.461605 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrdd\" (UniqueName: \"kubernetes.io/projected/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-kube-api-access-qtrdd\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.461676 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-logs\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.461757 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-config-data\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.462208 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-logs\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.468257 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-config-data\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.471269 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.492575 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrdd\" (UniqueName: \"kubernetes.io/projected/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-kube-api-access-qtrdd\") pod \"nova-api-0\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.534418 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:25:53 crc kubenswrapper[4799]: I0319 20:25:53.700362 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 20:25:54 crc kubenswrapper[4799]: W0319 20:25:54.052763 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cb125fe_e5f3_4190_80b1_fc10fdfee7d1.slice/crio-aa7081ac2154da179f7122da8b260fea09490c151d41220c80c11b53adcecf1b WatchSource:0}: Error finding container aa7081ac2154da179f7122da8b260fea09490c151d41220c80c11b53adcecf1b: Status 404 returned error can't find the container with id aa7081ac2154da179f7122da8b260fea09490c151d41220c80c11b53adcecf1b Mar 19 20:25:54 crc kubenswrapper[4799]: I0319 20:25:54.064087 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:25:54 crc kubenswrapper[4799]: I0319 20:25:54.779672 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1","Type":"ContainerStarted","Data":"5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79"} Mar 19 20:25:54 crc kubenswrapper[4799]: 
I0319 20:25:54.780039 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1","Type":"ContainerStarted","Data":"2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e"} Mar 19 20:25:54 crc kubenswrapper[4799]: I0319 20:25:54.780062 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1","Type":"ContainerStarted","Data":"aa7081ac2154da179f7122da8b260fea09490c151d41220c80c11b53adcecf1b"} Mar 19 20:25:54 crc kubenswrapper[4799]: I0319 20:25:54.805370 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.805352176 podStartE2EDuration="1.805352176s" podCreationTimestamp="2026-03-19 20:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:25:54.79506953 +0000 UTC m=+1232.401022592" watchObservedRunningTime="2026-03-19 20:25:54.805352176 +0000 UTC m=+1232.411305248" Mar 19 20:25:55 crc kubenswrapper[4799]: I0319 20:25:55.133259 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c12faf-4246-42fa-aad3-031afd1357f2" path="/var/lib/kubelet/pods/21c12faf-4246-42fa-aad3-031afd1357f2/volumes" Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.192719 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.194253 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.426464 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.426700 4799 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/kube-state-metrics-0" podUID="1392f44b-030e-4305-aa1e-14d89e1696db" containerName="kube-state-metrics" containerID="cri-o://8c00e5f00da3f3438e958825417727801c2e375dca1b9bbb65ae4b86343917f7" gracePeriod=30 Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.811378 4799 generic.go:334] "Generic (PLEG): container finished" podID="1392f44b-030e-4305-aa1e-14d89e1696db" containerID="8c00e5f00da3f3438e958825417727801c2e375dca1b9bbb65ae4b86343917f7" exitCode=2 Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.812552 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1392f44b-030e-4305-aa1e-14d89e1696db","Type":"ContainerDied","Data":"8c00e5f00da3f3438e958825417727801c2e375dca1b9bbb65ae4b86343917f7"} Mar 19 20:25:57 crc kubenswrapper[4799]: I0319 20:25:57.920828 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.065689 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kffhh\" (UniqueName: \"kubernetes.io/projected/1392f44b-030e-4305-aa1e-14d89e1696db-kube-api-access-kffhh\") pod \"1392f44b-030e-4305-aa1e-14d89e1696db\" (UID: \"1392f44b-030e-4305-aa1e-14d89e1696db\") " Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.074653 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1392f44b-030e-4305-aa1e-14d89e1696db-kube-api-access-kffhh" (OuterVolumeSpecName: "kube-api-access-kffhh") pod "1392f44b-030e-4305-aa1e-14d89e1696db" (UID: "1392f44b-030e-4305-aa1e-14d89e1696db"). InnerVolumeSpecName "kube-api-access-kffhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.079928 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.163103 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.167872 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kffhh\" (UniqueName: \"kubernetes.io/projected/1392f44b-030e-4305-aa1e-14d89e1696db-kube-api-access-kffhh\") on node \"crc\" DevicePath \"\"" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.209531 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.209769 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.827730 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1392f44b-030e-4305-aa1e-14d89e1696db","Type":"ContainerDied","Data":"61a4a40c5f8af6942fd47a7520e6d578dc12eab153dea85830188e0ea496748b"} Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.828109 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.828119 4799 scope.go:117] "RemoveContainer" containerID="8c00e5f00da3f3438e958825417727801c2e375dca1b9bbb65ae4b86343917f7" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.878853 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.885720 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.896947 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.932192 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:25:58 crc kubenswrapper[4799]: E0319 20:25:58.932676 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1392f44b-030e-4305-aa1e-14d89e1696db" containerName="kube-state-metrics" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.932701 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="1392f44b-030e-4305-aa1e-14d89e1696db" containerName="kube-state-metrics" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.932945 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="1392f44b-030e-4305-aa1e-14d89e1696db" containerName="kube-state-metrics" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.933765 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.936192 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.936262 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 19 20:25:58 crc kubenswrapper[4799]: I0319 20:25:58.955795 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.087232 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.087326 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.087447 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdlmr\" (UniqueName: \"kubernetes.io/projected/50808de7-9788-451a-8910-6be8f217ae09-kube-api-access-tdlmr\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.087499 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.138680 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1392f44b-030e-4305-aa1e-14d89e1696db" path="/var/lib/kubelet/pods/1392f44b-030e-4305-aa1e-14d89e1696db/volumes" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.189440 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.189542 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.189660 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdlmr\" (UniqueName: \"kubernetes.io/projected/50808de7-9788-451a-8910-6be8f217ae09-kube-api-access-tdlmr\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.189725 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " 
pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.196340 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.196652 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.204071 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/50808de7-9788-451a-8910-6be8f217ae09-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.210827 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdlmr\" (UniqueName: \"kubernetes.io/projected/50808de7-9788-451a-8910-6be8f217ae09-kube-api-access-tdlmr\") pod \"kube-state-metrics-0\" (UID: \"50808de7-9788-451a-8910-6be8f217ae09\") " pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.258991 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.594470 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.595399 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-central-agent" containerID="cri-o://6f627caa8f5379558b680834e38629d020e5d8166f8afd16ddc1993e618e9d4e" gracePeriod=30 Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.595820 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="proxy-httpd" containerID="cri-o://17405a49c8b2d234457da915034c70d5143d65c907f5f5ea820e5a3c6fa8891e" gracePeriod=30 Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.595873 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="sg-core" containerID="cri-o://f0144add50ea70b32cd098981d335a8b5249f18fddca693f0a655ea364a2319b" gracePeriod=30 Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.595946 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-notification-agent" containerID="cri-o://25597ba4ff789adabd2dd2fed33956f15575ec41234dea7f8d03cd5cf66ac66d" gracePeriod=30 Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.793150 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 20:25:59 crc kubenswrapper[4799]: W0319 20:25:59.799132 4799 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50808de7_9788_451a_8910_6be8f217ae09.slice/crio-cc76264adb43df71afe70dacd4051050ffff0046e2b264f937d547b51fe7694d WatchSource:0}: Error finding container cc76264adb43df71afe70dacd4051050ffff0046e2b264f937d547b51fe7694d: Status 404 returned error can't find the container with id cc76264adb43df71afe70dacd4051050ffff0046e2b264f937d547b51fe7694d Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.838501 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50808de7-9788-451a-8910-6be8f217ae09","Type":"ContainerStarted","Data":"cc76264adb43df71afe70dacd4051050ffff0046e2b264f937d547b51fe7694d"} Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.844548 4799 generic.go:334] "Generic (PLEG): container finished" podID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerID="17405a49c8b2d234457da915034c70d5143d65c907f5f5ea820e5a3c6fa8891e" exitCode=0 Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.844589 4799 generic.go:334] "Generic (PLEG): container finished" podID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerID="f0144add50ea70b32cd098981d335a8b5249f18fddca693f0a655ea364a2319b" exitCode=2 Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.844633 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerDied","Data":"17405a49c8b2d234457da915034c70d5143d65c907f5f5ea820e5a3c6fa8891e"} Mar 19 20:25:59 crc kubenswrapper[4799]: I0319 20:25:59.844679 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerDied","Data":"f0144add50ea70b32cd098981d335a8b5249f18fddca693f0a655ea364a2319b"} Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.131448 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565866-wxfvv"] Mar 
19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.133727 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.136582 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.136616 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.138765 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.147228 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-wxfvv"] Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.210928 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tnp\" (UniqueName: \"kubernetes.io/projected/202925b0-e76d-46a3-80aa-0ac2ba9dad11-kube-api-access-k7tnp\") pod \"auto-csr-approver-29565866-wxfvv\" (UID: \"202925b0-e76d-46a3-80aa-0ac2ba9dad11\") " pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.312674 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tnp\" (UniqueName: \"kubernetes.io/projected/202925b0-e76d-46a3-80aa-0ac2ba9dad11-kube-api-access-k7tnp\") pod \"auto-csr-approver-29565866-wxfvv\" (UID: \"202925b0-e76d-46a3-80aa-0ac2ba9dad11\") " pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.339012 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tnp\" (UniqueName: 
\"kubernetes.io/projected/202925b0-e76d-46a3-80aa-0ac2ba9dad11-kube-api-access-k7tnp\") pod \"auto-csr-approver-29565866-wxfvv\" (UID: \"202925b0-e76d-46a3-80aa-0ac2ba9dad11\") " pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.460521 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.855013 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50808de7-9788-451a-8910-6be8f217ae09","Type":"ContainerStarted","Data":"9a37ab9a80bbf3c4337cacb8c6ceac3955540e54bbb5e5a614826b7ad466d6f7"} Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.856993 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.861816 4799 generic.go:334] "Generic (PLEG): container finished" podID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerID="6f627caa8f5379558b680834e38629d020e5d8166f8afd16ddc1993e618e9d4e" exitCode=0 Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.861861 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerDied","Data":"6f627caa8f5379558b680834e38629d020e5d8166f8afd16ddc1993e618e9d4e"} Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.879007 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.446798899 podStartE2EDuration="2.878982792s" podCreationTimestamp="2026-03-19 20:25:58 +0000 UTC" firstStartedPulling="2026-03-19 20:25:59.80248666 +0000 UTC m=+1237.408439732" lastFinishedPulling="2026-03-19 20:26:00.234670553 +0000 UTC m=+1237.840623625" observedRunningTime="2026-03-19 20:26:00.87171891 +0000 UTC m=+1238.477671992" 
watchObservedRunningTime="2026-03-19 20:26:00.878982792 +0000 UTC m=+1238.484935874" Mar 19 20:26:00 crc kubenswrapper[4799]: W0319 20:26:00.952406 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod202925b0_e76d_46a3_80aa_0ac2ba9dad11.slice/crio-7193d8554a523ab9c325e7a290d57bcf97a87e7e7d383c6248ddcc22c0a53e55 WatchSource:0}: Error finding container 7193d8554a523ab9c325e7a290d57bcf97a87e7e7d383c6248ddcc22c0a53e55: Status 404 returned error can't find the container with id 7193d8554a523ab9c325e7a290d57bcf97a87e7e7d383c6248ddcc22c0a53e55 Mar 19 20:26:00 crc kubenswrapper[4799]: I0319 20:26:00.956368 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-wxfvv"] Mar 19 20:26:01 crc kubenswrapper[4799]: I0319 20:26:01.877605 4799 generic.go:334] "Generic (PLEG): container finished" podID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerID="25597ba4ff789adabd2dd2fed33956f15575ec41234dea7f8d03cd5cf66ac66d" exitCode=0 Mar 19 20:26:01 crc kubenswrapper[4799]: I0319 20:26:01.877659 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerDied","Data":"25597ba4ff789adabd2dd2fed33956f15575ec41234dea7f8d03cd5cf66ac66d"} Mar 19 20:26:01 crc kubenswrapper[4799]: I0319 20:26:01.880937 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" event={"ID":"202925b0-e76d-46a3-80aa-0ac2ba9dad11","Type":"ContainerStarted","Data":"7193d8554a523ab9c325e7a290d57bcf97a87e7e7d383c6248ddcc22c0a53e55"} Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.220619 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.350791 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-combined-ca-bundle\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351101 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-scripts\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351277 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-run-httpd\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351410 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-log-httpd\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351557 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-sg-core-conf-yaml\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351646 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-config-data\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351754 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mthc\" (UniqueName: \"kubernetes.io/projected/577f335a-7717-437f-af13-c8ffa4f8f4ea-kube-api-access-6mthc\") pod \"577f335a-7717-437f-af13-c8ffa4f8f4ea\" (UID: \"577f335a-7717-437f-af13-c8ffa4f8f4ea\") " Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351765 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.351884 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.352461 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.352564 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/577f335a-7717-437f-af13-c8ffa4f8f4ea-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.357165 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577f335a-7717-437f-af13-c8ffa4f8f4ea-kube-api-access-6mthc" (OuterVolumeSpecName: "kube-api-access-6mthc") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "kube-api-access-6mthc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.358537 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-scripts" (OuterVolumeSpecName: "scripts") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.390652 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.435101 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.454065 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.454101 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mthc\" (UniqueName: \"kubernetes.io/projected/577f335a-7717-437f-af13-c8ffa4f8f4ea-kube-api-access-6mthc\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.454114 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.454125 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.478515 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-config-data" (OuterVolumeSpecName: "config-data") pod "577f335a-7717-437f-af13-c8ffa4f8f4ea" (UID: "577f335a-7717-437f-af13-c8ffa4f8f4ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.556446 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577f335a-7717-437f-af13-c8ffa4f8f4ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.894523 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.896729 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"577f335a-7717-437f-af13-c8ffa4f8f4ea","Type":"ContainerDied","Data":"6294282a2f4c313564b449931967d6b48502f0c79c191c77026ad811429a1e36"} Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.897012 4799 scope.go:117] "RemoveContainer" containerID="17405a49c8b2d234457da915034c70d5143d65c907f5f5ea820e5a3c6fa8891e" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.898333 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" event={"ID":"202925b0-e76d-46a3-80aa-0ac2ba9dad11","Type":"ContainerStarted","Data":"b95b2d82ab6cd824f0ca6556956b4abcf544403b60629e3b3c60b38beb8af097"} Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.918142 4799 scope.go:117] "RemoveContainer" containerID="f0144add50ea70b32cd098981d335a8b5249f18fddca693f0a655ea364a2319b" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.920528 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" podStartSLOduration=1.364074819 podStartE2EDuration="2.920510844s" podCreationTimestamp="2026-03-19 20:26:00 +0000 UTC" firstStartedPulling="2026-03-19 20:26:00.956232923 +0000 UTC m=+1238.562186005" lastFinishedPulling="2026-03-19 20:26:02.512668958 +0000 UTC m=+1240.118622030" observedRunningTime="2026-03-19 20:26:02.918645232 +0000 UTC 
m=+1240.524598304" watchObservedRunningTime="2026-03-19 20:26:02.920510844 +0000 UTC m=+1240.526463916" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.952207 4799 scope.go:117] "RemoveContainer" containerID="25597ba4ff789adabd2dd2fed33956f15575ec41234dea7f8d03cd5cf66ac66d" Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.972971 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.990091 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:26:02 crc kubenswrapper[4799]: I0319 20:26:02.990174 4799 scope.go:117] "RemoveContainer" containerID="6f627caa8f5379558b680834e38629d020e5d8166f8afd16ddc1993e618e9d4e" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.008559 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:26:03 crc kubenswrapper[4799]: E0319 20:26:03.009093 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-notification-agent" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009116 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-notification-agent" Mar 19 20:26:03 crc kubenswrapper[4799]: E0319 20:26:03.009134 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="sg-core" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009143 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="sg-core" Mar 19 20:26:03 crc kubenswrapper[4799]: E0319 20:26:03.009173 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-central-agent" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009184 4799 
state_mem.go:107] "Deleted CPUSet assignment" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-central-agent" Mar 19 20:26:03 crc kubenswrapper[4799]: E0319 20:26:03.009196 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="proxy-httpd" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009204 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="proxy-httpd" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009479 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="proxy-httpd" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009496 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-notification-agent" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009518 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="sg-core" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.009537 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" containerName="ceilometer-central-agent" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.011705 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.017140 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.017226 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.017548 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.038185 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.155437 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577f335a-7717-437f-af13-c8ffa4f8f4ea" path="/var/lib/kubelet/pods/577f335a-7717-437f-af13-c8ffa4f8f4ea/volumes" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.170516 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.170978 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-config-data\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: E0319 20:26:03.170538 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod202925b0_e76d_46a3_80aa_0ac2ba9dad11.slice/crio-b95b2d82ab6cd824f0ca6556956b4abcf544403b60629e3b3c60b38beb8af097.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod202925b0_e76d_46a3_80aa_0ac2ba9dad11.slice/crio-conmon-b95b2d82ab6cd824f0ca6556956b4abcf544403b60629e3b3c60b38beb8af097.scope\": RecentStats: unable to find data in memory cache]" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.171168 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-scripts\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.171315 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.171457 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.172018 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-run-httpd\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc 
kubenswrapper[4799]: I0319 20:26:03.172136 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p74vx\" (UniqueName: \"kubernetes.io/projected/637b3c17-830d-4bb6-a678-2592608fcf9a-kube-api-access-p74vx\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.172298 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-log-httpd\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.274225 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-run-httpd\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.274669 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-run-httpd\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.274785 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p74vx\" (UniqueName: \"kubernetes.io/projected/637b3c17-830d-4bb6-a678-2592608fcf9a-kube-api-access-p74vx\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.274922 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-log-httpd\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.275157 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.275248 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-config-data\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.275331 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-scripts\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.275459 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.275546 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 
20:26:03.275268 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-log-httpd\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.280066 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.280677 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.280799 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.281009 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-config-data\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.281110 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-scripts\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " 
pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.289807 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p74vx\" (UniqueName: \"kubernetes.io/projected/637b3c17-830d-4bb6-a678-2592608fcf9a-kube-api-access-p74vx\") pod \"ceilometer-0\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") " pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.336538 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.534640 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.534964 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.847883 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.909947 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerStarted","Data":"0e17f2e1a6f2424d0d48129cc82c9e4fce4e1c6a2cff074f8b88e181cd44cd6f"} Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.913104 4799 generic.go:334] "Generic (PLEG): container finished" podID="202925b0-e76d-46a3-80aa-0ac2ba9dad11" containerID="b95b2d82ab6cd824f0ca6556956b4abcf544403b60629e3b3c60b38beb8af097" exitCode=0 Mar 19 20:26:03 crc kubenswrapper[4799]: I0319 20:26:03.913153 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" event={"ID":"202925b0-e76d-46a3-80aa-0ac2ba9dad11","Type":"ContainerDied","Data":"b95b2d82ab6cd824f0ca6556956b4abcf544403b60629e3b3c60b38beb8af097"} Mar 19 20:26:04 crc kubenswrapper[4799]: I0319 20:26:04.575687 4799 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:26:04 crc kubenswrapper[4799]: I0319 20:26:04.616669 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.192564 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.192832 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.379993 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.538083 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7tnp\" (UniqueName: \"kubernetes.io/projected/202925b0-e76d-46a3-80aa-0ac2ba9dad11-kube-api-access-k7tnp\") pod \"202925b0-e76d-46a3-80aa-0ac2ba9dad11\" (UID: \"202925b0-e76d-46a3-80aa-0ac2ba9dad11\") " Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.549140 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202925b0-e76d-46a3-80aa-0ac2ba9dad11-kube-api-access-k7tnp" (OuterVolumeSpecName: "kube-api-access-k7tnp") pod "202925b0-e76d-46a3-80aa-0ac2ba9dad11" (UID: "202925b0-e76d-46a3-80aa-0ac2ba9dad11"). InnerVolumeSpecName "kube-api-access-k7tnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.640964 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7tnp\" (UniqueName: \"kubernetes.io/projected/202925b0-e76d-46a3-80aa-0ac2ba9dad11-kube-api-access-k7tnp\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.953784 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerStarted","Data":"a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498"} Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.953830 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerStarted","Data":"aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1"} Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.958058 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" event={"ID":"202925b0-e76d-46a3-80aa-0ac2ba9dad11","Type":"ContainerDied","Data":"7193d8554a523ab9c325e7a290d57bcf97a87e7e7d383c6248ddcc22c0a53e55"} Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.958104 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7193d8554a523ab9c325e7a290d57bcf97a87e7e7d383c6248ddcc22c0a53e55" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.958163 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-wxfvv" Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.988527 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-pf9sc"] Mar 19 20:26:05 crc kubenswrapper[4799]: I0319 20:26:05.998147 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-pf9sc"] Mar 19 20:26:06 crc kubenswrapper[4799]: I0319 20:26:06.972487 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerStarted","Data":"575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f"} Mar 19 20:26:07 crc kubenswrapper[4799]: I0319 20:26:07.129138 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fed39f2-bbdc-492f-be94-cdde1c1798ed" path="/var/lib/kubelet/pods/2fed39f2-bbdc-492f-be94-cdde1c1798ed/volumes" Mar 19 20:26:07 crc kubenswrapper[4799]: I0319 20:26:07.203953 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 20:26:07 crc kubenswrapper[4799]: I0319 20:26:07.215504 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 20:26:07 crc kubenswrapper[4799]: I0319 20:26:07.217892 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 20:26:07 crc kubenswrapper[4799]: I0319 20:26:07.989696 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 20:26:08 crc kubenswrapper[4799]: I0319 20:26:08.991235 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerStarted","Data":"e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4"} Mar 19 20:26:09 crc kubenswrapper[4799]: I0319 
20:26:09.018600 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.589488383 podStartE2EDuration="7.018581191s" podCreationTimestamp="2026-03-19 20:26:02 +0000 UTC" firstStartedPulling="2026-03-19 20:26:03.862633885 +0000 UTC m=+1241.468586957" lastFinishedPulling="2026-03-19 20:26:08.291726683 +0000 UTC m=+1245.897679765" observedRunningTime="2026-03-19 20:26:09.01531356 +0000 UTC m=+1246.621266652" watchObservedRunningTime="2026-03-19 20:26:09.018581191 +0000 UTC m=+1246.624534263" Mar 19 20:26:09 crc kubenswrapper[4799]: I0319 20:26:09.276720 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 20:26:10 crc kubenswrapper[4799]: I0319 20:26:10.000763 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:26:10 crc kubenswrapper[4799]: I0319 20:26:10.975551 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.020200 4799 generic.go:334] "Generic (PLEG): container finished" podID="671cfbdc-6f8c-452e-a771-6a762660c355" containerID="74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030" exitCode=137 Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.021473 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.021971 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"671cfbdc-6f8c-452e-a771-6a762660c355","Type":"ContainerDied","Data":"74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030"} Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.022006 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"671cfbdc-6f8c-452e-a771-6a762660c355","Type":"ContainerDied","Data":"97a1aae30921f5166f53b4c519d6b29b9890ff75fee3365577c30ed80b5910c6"} Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.022029 4799 scope.go:117] "RemoveContainer" containerID="74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.055689 4799 scope.go:117] "RemoveContainer" containerID="74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030" Mar 19 20:26:11 crc kubenswrapper[4799]: E0319 20:26:11.056216 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030\": container with ID starting with 74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030 not found: ID does not exist" containerID="74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.056266 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030"} err="failed to get container status \"74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030\": rpc error: code = NotFound desc = could not find container \"74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030\": container with ID starting with 
74508c8cc11f4cd74b6b221e9e15edc5ece09742a8adb56f1a869cf31b80f030 not found: ID does not exist" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.070681 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2r9t\" (UniqueName: \"kubernetes.io/projected/671cfbdc-6f8c-452e-a771-6a762660c355-kube-api-access-v2r9t\") pod \"671cfbdc-6f8c-452e-a771-6a762660c355\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.070752 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-config-data\") pod \"671cfbdc-6f8c-452e-a771-6a762660c355\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.070810 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-combined-ca-bundle\") pod \"671cfbdc-6f8c-452e-a771-6a762660c355\" (UID: \"671cfbdc-6f8c-452e-a771-6a762660c355\") " Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.077357 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671cfbdc-6f8c-452e-a771-6a762660c355-kube-api-access-v2r9t" (OuterVolumeSpecName: "kube-api-access-v2r9t") pod "671cfbdc-6f8c-452e-a771-6a762660c355" (UID: "671cfbdc-6f8c-452e-a771-6a762660c355"). InnerVolumeSpecName "kube-api-access-v2r9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.099840 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "671cfbdc-6f8c-452e-a771-6a762660c355" (UID: "671cfbdc-6f8c-452e-a771-6a762660c355"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.107647 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-config-data" (OuterVolumeSpecName: "config-data") pod "671cfbdc-6f8c-452e-a771-6a762660c355" (UID: "671cfbdc-6f8c-452e-a771-6a762660c355"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.172967 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2r9t\" (UniqueName: \"kubernetes.io/projected/671cfbdc-6f8c-452e-a771-6a762660c355-kube-api-access-v2r9t\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.172992 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.173002 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671cfbdc-6f8c-452e-a771-6a762660c355-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.365132 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.383413 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.404051 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:26:11 crc kubenswrapper[4799]: E0319 20:26:11.405656 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671cfbdc-6f8c-452e-a771-6a762660c355" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.405691 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="671cfbdc-6f8c-452e-a771-6a762660c355" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 20:26:11 crc kubenswrapper[4799]: E0319 20:26:11.405733 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="202925b0-e76d-46a3-80aa-0ac2ba9dad11" containerName="oc" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.405742 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="202925b0-e76d-46a3-80aa-0ac2ba9dad11" containerName="oc" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.406066 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="671cfbdc-6f8c-452e-a771-6a762660c355" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.406090 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="202925b0-e76d-46a3-80aa-0ac2ba9dad11" containerName="oc" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.406910 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.412996 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.413193 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.413359 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.427544 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.477814 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.477919 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.478019 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvfdq\" (UniqueName: \"kubernetes.io/projected/e45143c6-a88e-40c6-a6fc-5452da3be735-kube-api-access-lvfdq\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc 
kubenswrapper[4799]: I0319 20:26:11.478065 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.478094 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.534911 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.534960 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.580209 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvfdq\" (UniqueName: \"kubernetes.io/projected/e45143c6-a88e-40c6-a6fc-5452da3be735-kube-api-access-lvfdq\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.580253 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.580279 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.580354 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.580426 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.585893 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.586748 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.587689 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.593343 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45143c6-a88e-40c6-a6fc-5452da3be735-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.597710 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvfdq\" (UniqueName: \"kubernetes.io/projected/e45143c6-a88e-40c6-a6fc-5452da3be735-kube-api-access-lvfdq\") pod \"nova-cell1-novncproxy-0\" (UID: \"e45143c6-a88e-40c6-a6fc-5452da3be735\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:11 crc kubenswrapper[4799]: I0319 20:26:11.728987 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 20:26:12 crc kubenswrapper[4799]: I0319 20:26:12.241714 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.048691 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e45143c6-a88e-40c6-a6fc-5452da3be735","Type":"ContainerStarted","Data":"bba927e71121c111bddcda0d158d770e9c116fe1da8273aa89edc76eadfb0094"}
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.049462 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e45143c6-a88e-40c6-a6fc-5452da3be735","Type":"ContainerStarted","Data":"f044180121693f319df7b705e59ff6e9bd5f3a0247b01acf4a5292f36863fc77"}
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.083555 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.08352548 podStartE2EDuration="2.08352548s" podCreationTimestamp="2026-03-19 20:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:13.073266194 +0000 UTC m=+1250.679219326" watchObservedRunningTime="2026-03-19 20:26:13.08352548 +0000 UTC m=+1250.689478592"
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.143220 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671cfbdc-6f8c-452e-a771-6a762660c355" path="/var/lib/kubelet/pods/671cfbdc-6f8c-452e-a771-6a762660c355/volumes"
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.539185 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.540086 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 19 20:26:13 crc kubenswrapper[4799]: I0319 20:26:13.542579 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.061966 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.274863 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"]
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.276790 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.291665 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"]
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.341021 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.341416 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.341459 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-config\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.341563 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.341651 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-svc\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.341682 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgcj\" (UniqueName: \"kubernetes.io/projected/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-kube-api-access-bdgcj\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.443752 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.443835 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.443877 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-config\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.443930 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.443955 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-svc\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.443970 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgcj\" (UniqueName: \"kubernetes.io/projected/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-kube-api-access-bdgcj\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.444958 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.445226 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-svc\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.445307 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-config\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.445478 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.445491 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.464628 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgcj\" (UniqueName: \"kubernetes.io/projected/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-kube-api-access-bdgcj\") pod \"dnsmasq-dns-7b6d8fd79c-ccgcs\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:14 crc kubenswrapper[4799]: I0319 20:26:14.606684 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:15 crc kubenswrapper[4799]: I0319 20:26:15.112780 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"]
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.094740 4799 generic.go:334] "Generic (PLEG): container finished" podID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerID="41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be" exitCode=0
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.094867 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" event={"ID":"08ab47f7-ed2d-457a-9b03-31a54cd2f62e","Type":"ContainerDied","Data":"41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be"}
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.095078 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" event={"ID":"08ab47f7-ed2d-457a-9b03-31a54cd2f62e","Type":"ContainerStarted","Data":"07c0e08bbdff528858aca040c950372f8217da76f222008fdc9be049cf9745dd"}
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.344323 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.344901 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-central-agent" containerID="cri-o://aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1" gracePeriod=30
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.345015 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-notification-agent" containerID="cri-o://a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498" gracePeriod=30
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.345043 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="sg-core" containerID="cri-o://575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f" gracePeriod=30
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.345183 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="proxy-httpd" containerID="cri-o://e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4" gracePeriod=30
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.574564 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 19 20:26:16 crc kubenswrapper[4799]: I0319 20:26:16.729911 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.107434 4799 generic.go:334] "Generic (PLEG): container finished" podID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerID="e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4" exitCode=0
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.107795 4799 generic.go:334] "Generic (PLEG): container finished" podID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerID="575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f" exitCode=2
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.107811 4799 generic.go:334] "Generic (PLEG): container finished" podID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerID="aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1" exitCode=0
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.107479 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerDied","Data":"e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4"}
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.107916 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerDied","Data":"575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f"}
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.107934 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerDied","Data":"aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1"}
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.111990 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" event={"ID":"08ab47f7-ed2d-457a-9b03-31a54cd2f62e","Type":"ContainerStarted","Data":"f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2"}
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.112157 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.112742 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-log" containerID="cri-o://2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e" gracePeriod=30
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.112942 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-api" containerID="cri-o://5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79" gracePeriod=30
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.145689 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" podStartSLOduration=3.145665501 podStartE2EDuration="3.145665501s" podCreationTimestamp="2026-03-19 20:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:17.139714476 +0000 UTC m=+1254.745667568" watchObservedRunningTime="2026-03-19 20:26:17.145665501 +0000 UTC m=+1254.751618583"
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.859883 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904329 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p74vx\" (UniqueName: \"kubernetes.io/projected/637b3c17-830d-4bb6-a678-2592608fcf9a-kube-api-access-p74vx\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904422 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-config-data\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904470 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-sg-core-conf-yaml\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904531 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-combined-ca-bundle\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904557 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-ceilometer-tls-certs\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904595 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-log-httpd\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904621 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-run-httpd\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.904779 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-scripts\") pod \"637b3c17-830d-4bb6-a678-2592608fcf9a\" (UID: \"637b3c17-830d-4bb6-a678-2592608fcf9a\") "
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.910463 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.910721 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.911431 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-scripts" (OuterVolumeSpecName: "scripts") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.914596 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637b3c17-830d-4bb6-a678-2592608fcf9a-kube-api-access-p74vx" (OuterVolumeSpecName: "kube-api-access-p74vx") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "kube-api-access-p74vx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.952982 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 20:26:17 crc kubenswrapper[4799]: I0319 20:26:17.981902 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.001167 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.003642 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-config-data" (OuterVolumeSpecName: "config-data") pod "637b3c17-830d-4bb6-a678-2592608fcf9a" (UID: "637b3c17-830d-4bb6-a678-2592608fcf9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007078 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007119 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p74vx\" (UniqueName: \"kubernetes.io/projected/637b3c17-830d-4bb6-a678-2592608fcf9a-kube-api-access-p74vx\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007131 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007140 4799 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007148 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007156 4799 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/637b3c17-830d-4bb6-a678-2592608fcf9a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007165 4799 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.007172 4799 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/637b3c17-830d-4bb6-a678-2592608fcf9a-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.128531 4799 generic.go:334] "Generic (PLEG): container finished" podID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerID="a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498" exitCode=0
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.128605 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerDied","Data":"a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498"}
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.128769 4799 scope.go:117] "RemoveContainer" containerID="e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.128775 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"637b3c17-830d-4bb6-a678-2592608fcf9a","Type":"ContainerDied","Data":"0e17f2e1a6f2424d0d48129cc82c9e4fce4e1c6a2cff074f8b88e181cd44cd6f"}
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.129374 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.131086 4799 generic.go:334] "Generic (PLEG): container finished" podID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerID="2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e" exitCode=143
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.131163 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1","Type":"ContainerDied","Data":"2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e"}
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.152478 4799 scope.go:117] "RemoveContainer" containerID="575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.170038 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.175483 4799 scope.go:117] "RemoveContainer" containerID="a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.184711 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.205662 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.206145 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="proxy-httpd"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206168 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="proxy-httpd"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.206197 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-notification-agent"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206207 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-notification-agent"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.206225 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-central-agent"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206233 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-central-agent"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.206255 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="sg-core"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206262 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="sg-core"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206534 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="sg-core"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206563 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-central-agent"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206583 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="proxy-httpd"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.206597 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" containerName="ceilometer-notification-agent"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.221150 4799 scope.go:117] "RemoveContainer" containerID="aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.227725 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.228017 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.237114 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.244298 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.244583 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.251908 4799 scope.go:117] "RemoveContainer" containerID="e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.252766 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4\": container with ID starting with e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4 not found: ID does not exist" containerID="e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.252805 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4"} err="failed to get container status \"e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4\": rpc error: code = NotFound desc = could not find container \"e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4\": container with ID starting with e66bcf610c2c5c43c496447dab34c7a32bb6888d563babf0d11f7dcc6878dcb4 not found: ID does not exist"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.252913 4799 scope.go:117] "RemoveContainer" containerID="575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.253216 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f\": container with ID starting with 575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f not found: ID does not exist" containerID="575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.253247 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f"} err="failed to get container status \"575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f\": rpc error: code = NotFound desc = could not find container \"575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f\": container with ID starting with 575a0d64ea26ac329e0834b762175cbb01b4c4af34dfe5669fe0c10ea1e4082f not found: ID does not exist"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.253264 4799 scope.go:117] "RemoveContainer" containerID="a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.254683 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498\": container with ID starting with a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498 not found: ID does not exist" containerID="a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.254742 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498"} err="failed to get container status \"a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498\": rpc error: code = NotFound desc = could not find container \"a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498\": container with ID starting with a1ccffaedf2ab1624863669c5060b2327c010fbc8dfe47ed6ac3b01bd62db498 not found: ID does not exist"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.254776 4799 scope.go:117] "RemoveContainer" containerID="aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1"
Mar 19 20:26:18 crc kubenswrapper[4799]: E0319 20:26:18.255220 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1\": container with ID starting with aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1 not found: ID does not exist" containerID="aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.255249 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1"} err="failed to get container status \"aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1\": rpc error: code = NotFound desc = could not find container \"aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1\": container with ID starting with aaa8f1d4a90ed696e097e4aa8b54bae56afce7ff6ba328115787d77faa4638e1 not found: ID does not exist"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.311997 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312071 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-scripts\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312120 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509e207f-9e25-4446-a56c-871da702f099-run-httpd\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312139 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ksll\" (UniqueName: \"kubernetes.io/projected/509e207f-9e25-4446-a56c-871da702f099-kube-api-access-8ksll\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312335 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509e207f-9e25-4446-a56c-871da702f099-log-httpd\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312569 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-config-data\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312694 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.312755 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414593 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509e207f-9e25-4446-a56c-871da702f099-log-httpd\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414681 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-config-data\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414718 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0"
Mar 19
20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414750 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414791 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414838 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-scripts\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414877 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509e207f-9e25-4446-a56c-871da702f099-run-httpd\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.414900 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ksll\" (UniqueName: \"kubernetes.io/projected/509e207f-9e25-4446-a56c-871da702f099-kube-api-access-8ksll\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.415877 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/509e207f-9e25-4446-a56c-871da702f099-log-httpd\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.416929 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/509e207f-9e25-4446-a56c-871da702f099-run-httpd\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.423026 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.423125 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.423497 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.423716 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-config-data\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.424807 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509e207f-9e25-4446-a56c-871da702f099-scripts\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.432899 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ksll\" (UniqueName: \"kubernetes.io/projected/509e207f-9e25-4446-a56c-871da702f099-kube-api-access-8ksll\") pod \"ceilometer-0\" (UID: \"509e207f-9e25-4446-a56c-871da702f099\") " pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.565312 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 20:26:18 crc kubenswrapper[4799]: I0319 20:26:18.837467 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 20:26:19 crc kubenswrapper[4799]: I0319 20:26:19.128074 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637b3c17-830d-4bb6-a678-2592608fcf9a" path="/var/lib/kubelet/pods/637b3c17-830d-4bb6-a678-2592608fcf9a/volumes" Mar 19 20:26:19 crc kubenswrapper[4799]: I0319 20:26:19.144445 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509e207f-9e25-4446-a56c-871da702f099","Type":"ContainerStarted","Data":"94a1b62da8387d0d61a62d6e74b151f3c68536265011a42535188a4203fca41c"} Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.162217 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509e207f-9e25-4446-a56c-871da702f099","Type":"ContainerStarted","Data":"cea69817ebff95d75f87165778d03eb7210e6ba6b5c77bd8454380cf152c4c24"} Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.705232 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.907579 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle\") pod \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.907724 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-config-data\") pod \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.907783 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtrdd\" (UniqueName: \"kubernetes.io/projected/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-kube-api-access-qtrdd\") pod \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.907863 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-logs\") pod \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.908450 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-logs" (OuterVolumeSpecName: "logs") pod "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" (UID: "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.908789 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.913976 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-kube-api-access-qtrdd" (OuterVolumeSpecName: "kube-api-access-qtrdd") pod "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" (UID: "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1"). InnerVolumeSpecName "kube-api-access-qtrdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:20 crc kubenswrapper[4799]: E0319 20:26:20.937771 4799 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle podName:9cb125fe-e5f3-4190-80b1-fc10fdfee7d1 nodeName:}" failed. No retries permitted until 2026-03-19 20:26:21.437745453 +0000 UTC m=+1259.043698525 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle") pod "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" (UID: "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1") : error deleting /var/lib/kubelet/pods/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1/volume-subpaths: remove /var/lib/kubelet/pods/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1/volume-subpaths: no such file or directory Mar 19 20:26:20 crc kubenswrapper[4799]: I0319 20:26:20.944866 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-config-data" (OuterVolumeSpecName: "config-data") pod "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" (UID: "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.011037 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.011408 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtrdd\" (UniqueName: \"kubernetes.io/projected/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-kube-api-access-qtrdd\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.177978 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509e207f-9e25-4446-a56c-871da702f099","Type":"ContainerStarted","Data":"7bf56688b38dfea25129ec0e35070d7fb38edb4f7a29d333968b07ed28a51c46"} Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.178064 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509e207f-9e25-4446-a56c-871da702f099","Type":"ContainerStarted","Data":"ff05d0352e6278c848589aa7cdca161f351ab59101caa7fdb6b01e980db1f397"} Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.181156 4799 generic.go:334] "Generic (PLEG): container finished" podID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerID="5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79" exitCode=0 Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.181206 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1","Type":"ContainerDied","Data":"5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79"} Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.181238 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1","Type":"ContainerDied","Data":"aa7081ac2154da179f7122da8b260fea09490c151d41220c80c11b53adcecf1b"} Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.181418 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.181493 4799 scope.go:117] "RemoveContainer" containerID="5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.200885 4799 scope.go:117] "RemoveContainer" containerID="2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.237020 4799 scope.go:117] "RemoveContainer" containerID="5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79" Mar 19 20:26:21 crc kubenswrapper[4799]: E0319 20:26:21.237552 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79\": container with ID starting with 5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79 not found: ID does not exist" containerID="5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.237595 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79"} err="failed to get container status \"5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79\": rpc error: code = NotFound desc = could not find container \"5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79\": container with ID starting with 5e99c46363f4f4cd51daf9efeefdf776720da7ba128581a81adfe76f4e4c4c79 not found: ID does not exist" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.237620 4799 scope.go:117] 
"RemoveContainer" containerID="2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e" Mar 19 20:26:21 crc kubenswrapper[4799]: E0319 20:26:21.237906 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e\": container with ID starting with 2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e not found: ID does not exist" containerID="2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.237935 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e"} err="failed to get container status \"2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e\": rpc error: code = NotFound desc = could not find container \"2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e\": container with ID starting with 2cfee01d2de0da91a9998c38e3bc2d0620a4366c4570803e8933032e0fabae3e not found: ID does not exist" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.521025 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle\") pod \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\" (UID: \"9cb125fe-e5f3-4190-80b1-fc10fdfee7d1\") " Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.525776 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" (UID: "9cb125fe-e5f3-4190-80b1-fc10fdfee7d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.623292 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.729341 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.765590 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.825085 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.840550 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.854960 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:21 crc kubenswrapper[4799]: E0319 20:26:21.855357 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-api" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.855396 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-api" Mar 19 20:26:21 crc kubenswrapper[4799]: E0319 20:26:21.855421 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-log" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.855430 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-log" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.855673 4799 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-api" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.855697 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" containerName="nova-api-log" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.856779 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.859802 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.860265 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.864251 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 20:26:21 crc kubenswrapper[4799]: I0319 20:26:21.866635 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.029772 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56pd\" (UniqueName: \"kubernetes.io/projected/f666539f-aa23-40f9-b6cb-6cd15d8c1729-kube-api-access-j56pd\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.029883 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.029905 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666539f-aa23-40f9-b6cb-6cd15d8c1729-logs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.029948 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.029965 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-config-data\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.029988 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-public-tls-certs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.132567 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.132607 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-config-data\") pod \"nova-api-0\" 
(UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.132632 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-public-tls-certs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.132728 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56pd\" (UniqueName: \"kubernetes.io/projected/f666539f-aa23-40f9-b6cb-6cd15d8c1729-kube-api-access-j56pd\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.132842 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.132863 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666539f-aa23-40f9-b6cb-6cd15d8c1729-logs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.133239 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666539f-aa23-40f9-b6cb-6cd15d8c1729-logs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.138339 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.138413 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.139165 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-public-tls-certs\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.139880 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-config-data\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.164773 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56pd\" (UniqueName: \"kubernetes.io/projected/f666539f-aa23-40f9-b6cb-6cd15d8c1729-kube-api-access-j56pd\") pod \"nova-api-0\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.172895 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.215701 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.435316 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-c8765"] Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.437031 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.439280 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-config-data\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.439438 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.439462 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxj9q\" (UniqueName: \"kubernetes.io/projected/8149a1d2-5a92-4549-921c-c6a14131c0c7-kube-api-access-lxj9q\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.439480 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-scripts\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.441418 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.446880 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.453101 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-c8765"] Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.540358 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.540414 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxj9q\" (UniqueName: \"kubernetes.io/projected/8149a1d2-5a92-4549-921c-c6a14131c0c7-kube-api-access-lxj9q\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.540432 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-scripts\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.540453 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-config-data\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.545698 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-scripts\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.548952 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-config-data\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.557063 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.560155 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxj9q\" (UniqueName: \"kubernetes.io/projected/8149a1d2-5a92-4549-921c-c6a14131c0c7-kube-api-access-lxj9q\") pod \"nova-cell1-cell-mapping-c8765\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:22 crc kubenswrapper[4799]: I0319 20:26:22.711801 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:22 crc kubenswrapper[4799]: 
I0319 20:26:22.766118 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:23 crc kubenswrapper[4799]: I0319 20:26:23.128057 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb125fe-e5f3-4190-80b1-fc10fdfee7d1" path="/var/lib/kubelet/pods/9cb125fe-e5f3-4190-80b1-fc10fdfee7d1/volumes" Mar 19 20:26:23 crc kubenswrapper[4799]: I0319 20:26:23.201813 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f666539f-aa23-40f9-b6cb-6cd15d8c1729","Type":"ContainerStarted","Data":"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985"} Mar 19 20:26:23 crc kubenswrapper[4799]: I0319 20:26:23.201865 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f666539f-aa23-40f9-b6cb-6cd15d8c1729","Type":"ContainerStarted","Data":"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220"} Mar 19 20:26:23 crc kubenswrapper[4799]: I0319 20:26:23.201879 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f666539f-aa23-40f9-b6cb-6cd15d8c1729","Type":"ContainerStarted","Data":"a68fb2683af077dca4deb960508e2d839357358b17af23a0bae45a717a4a8ac8"} Mar 19 20:26:23 crc kubenswrapper[4799]: I0319 20:26:23.236822 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-c8765"] Mar 19 20:26:23 crc kubenswrapper[4799]: I0319 20:26:23.237903 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.237021951 podStartE2EDuration="2.237021951s" podCreationTimestamp="2026-03-19 20:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:23.226272972 +0000 UTC m=+1260.832226064" watchObservedRunningTime="2026-03-19 20:26:23.237021951 +0000 UTC 
m=+1260.842975023" Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.215510 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c8765" event={"ID":"8149a1d2-5a92-4549-921c-c6a14131c0c7","Type":"ContainerStarted","Data":"8ce6a70a5e04e3d4f7c136b156a0d9a979f5cb3eef6074f7c3511c7bdb2ac4c6"} Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.216173 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c8765" event={"ID":"8149a1d2-5a92-4549-921c-c6a14131c0c7","Type":"ContainerStarted","Data":"aa79f0a767afb86fd3691039c58c8c21c29f037344371662e145c7e207cb54f8"} Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.226011 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"509e207f-9e25-4446-a56c-871da702f099","Type":"ContainerStarted","Data":"5f47df59aea1e415720573710d4d76cf3881a02eefaa9690932cf7e08382f052"} Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.226461 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.253958 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-c8765" podStartSLOduration=2.253930215 podStartE2EDuration="2.253930215s" podCreationTimestamp="2026-03-19 20:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:24.251145997 +0000 UTC m=+1261.857099169" watchObservedRunningTime="2026-03-19 20:26:24.253930215 +0000 UTC m=+1261.859883327" Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.315523 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.876196678 podStartE2EDuration="6.315495659s" podCreationTimestamp="2026-03-19 20:26:18 +0000 UTC" firstStartedPulling="2026-03-19 
20:26:18.847447164 +0000 UTC m=+1256.453400236" lastFinishedPulling="2026-03-19 20:26:23.286746135 +0000 UTC m=+1260.892699217" observedRunningTime="2026-03-19 20:26:24.305537312 +0000 UTC m=+1261.911490384" watchObservedRunningTime="2026-03-19 20:26:24.315495659 +0000 UTC m=+1261.921448741" Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.607718 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.692444 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-gdxrf"] Mar 19 20:26:24 crc kubenswrapper[4799]: I0319 20:26:24.693052 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerName="dnsmasq-dns" containerID="cri-o://3975bc0e0caaa101f99848db60036ac2d9f82a22215243eb052d2a1bec787e85" gracePeriod=10 Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.245736 4799 generic.go:334] "Generic (PLEG): container finished" podID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerID="3975bc0e0caaa101f99848db60036ac2d9f82a22215243eb052d2a1bec787e85" exitCode=0 Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.246038 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" event={"ID":"eb382d8f-526e-4f8c-bc45-d1afd184fb98","Type":"ContainerDied","Data":"3975bc0e0caaa101f99848db60036ac2d9f82a22215243eb052d2a1bec787e85"} Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.246168 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" event={"ID":"eb382d8f-526e-4f8c-bc45-d1afd184fb98","Type":"ContainerDied","Data":"037b3d4e08c200dbba159fe6ec4f24cc013bef0085454f900abd417565e01274"} Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.246186 4799 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="037b3d4e08c200dbba159fe6ec4f24cc013bef0085454f900abd417565e01274" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.260578 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.405814 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-swift-storage-0\") pod \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.405992 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-svc\") pod \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.406047 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jscqw\" (UniqueName: \"kubernetes.io/projected/eb382d8f-526e-4f8c-bc45-d1afd184fb98-kube-api-access-jscqw\") pod \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.406079 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-nb\") pod \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.406096 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-sb\") pod 
\"eb382d8f-526e-4f8c-bc45-d1afd184fb98\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.406124 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-config\") pod \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\" (UID: \"eb382d8f-526e-4f8c-bc45-d1afd184fb98\") " Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.420610 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb382d8f-526e-4f8c-bc45-d1afd184fb98-kube-api-access-jscqw" (OuterVolumeSpecName: "kube-api-access-jscqw") pod "eb382d8f-526e-4f8c-bc45-d1afd184fb98" (UID: "eb382d8f-526e-4f8c-bc45-d1afd184fb98"). InnerVolumeSpecName "kube-api-access-jscqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.453969 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb382d8f-526e-4f8c-bc45-d1afd184fb98" (UID: "eb382d8f-526e-4f8c-bc45-d1afd184fb98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.459168 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb382d8f-526e-4f8c-bc45-d1afd184fb98" (UID: "eb382d8f-526e-4f8c-bc45-d1afd184fb98"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.461904 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb382d8f-526e-4f8c-bc45-d1afd184fb98" (UID: "eb382d8f-526e-4f8c-bc45-d1afd184fb98"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.477605 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb382d8f-526e-4f8c-bc45-d1afd184fb98" (UID: "eb382d8f-526e-4f8c-bc45-d1afd184fb98"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.480508 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-config" (OuterVolumeSpecName: "config") pod "eb382d8f-526e-4f8c-bc45-d1afd184fb98" (UID: "eb382d8f-526e-4f8c-bc45-d1afd184fb98"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.508783 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.508816 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.508826 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jscqw\" (UniqueName: \"kubernetes.io/projected/eb382d8f-526e-4f8c-bc45-d1afd184fb98-kube-api-access-jscqw\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.508837 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.508846 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:25 crc kubenswrapper[4799]: I0319 20:26:25.508854 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb382d8f-526e-4f8c-bc45-d1afd184fb98-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:26 crc kubenswrapper[4799]: I0319 20:26:26.259945 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-gdxrf" Mar 19 20:26:26 crc kubenswrapper[4799]: I0319 20:26:26.303067 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-gdxrf"] Mar 19 20:26:26 crc kubenswrapper[4799]: I0319 20:26:26.316009 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-gdxrf"] Mar 19 20:26:27 crc kubenswrapper[4799]: I0319 20:26:27.130888 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" path="/var/lib/kubelet/pods/eb382d8f-526e-4f8c-bc45-d1afd184fb98/volumes" Mar 19 20:26:28 crc kubenswrapper[4799]: I0319 20:26:28.282452 4799 generic.go:334] "Generic (PLEG): container finished" podID="8149a1d2-5a92-4549-921c-c6a14131c0c7" containerID="8ce6a70a5e04e3d4f7c136b156a0d9a979f5cb3eef6074f7c3511c7bdb2ac4c6" exitCode=0 Mar 19 20:26:28 crc kubenswrapper[4799]: I0319 20:26:28.282503 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c8765" event={"ID":"8149a1d2-5a92-4549-921c-c6a14131c0c7","Type":"ContainerDied","Data":"8ce6a70a5e04e3d4f7c136b156a0d9a979f5cb3eef6074f7c3511c7bdb2ac4c6"} Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.765271 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.907343 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-scripts\") pod \"8149a1d2-5a92-4549-921c-c6a14131c0c7\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.907453 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-config-data\") pod \"8149a1d2-5a92-4549-921c-c6a14131c0c7\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.907507 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-combined-ca-bundle\") pod \"8149a1d2-5a92-4549-921c-c6a14131c0c7\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.907661 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxj9q\" (UniqueName: \"kubernetes.io/projected/8149a1d2-5a92-4549-921c-c6a14131c0c7-kube-api-access-lxj9q\") pod \"8149a1d2-5a92-4549-921c-c6a14131c0c7\" (UID: \"8149a1d2-5a92-4549-921c-c6a14131c0c7\") " Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.914617 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8149a1d2-5a92-4549-921c-c6a14131c0c7-kube-api-access-lxj9q" (OuterVolumeSpecName: "kube-api-access-lxj9q") pod "8149a1d2-5a92-4549-921c-c6a14131c0c7" (UID: "8149a1d2-5a92-4549-921c-c6a14131c0c7"). InnerVolumeSpecName "kube-api-access-lxj9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.916069 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-scripts" (OuterVolumeSpecName: "scripts") pod "8149a1d2-5a92-4549-921c-c6a14131c0c7" (UID: "8149a1d2-5a92-4549-921c-c6a14131c0c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.945114 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8149a1d2-5a92-4549-921c-c6a14131c0c7" (UID: "8149a1d2-5a92-4549-921c-c6a14131c0c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:29 crc kubenswrapper[4799]: I0319 20:26:29.952901 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-config-data" (OuterVolumeSpecName: "config-data") pod "8149a1d2-5a92-4549-921c-c6a14131c0c7" (UID: "8149a1d2-5a92-4549-921c-c6a14131c0c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.010452 4799 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.010492 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.010506 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8149a1d2-5a92-4549-921c-c6a14131c0c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.010520 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxj9q\" (UniqueName: \"kubernetes.io/projected/8149a1d2-5a92-4549-921c-c6a14131c0c7-kube-api-access-lxj9q\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.314503 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c8765" event={"ID":"8149a1d2-5a92-4549-921c-c6a14131c0c7","Type":"ContainerDied","Data":"aa79f0a767afb86fd3691039c58c8c21c29f037344371662e145c7e207cb54f8"} Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.314959 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa79f0a767afb86fd3691039c58c8c21c29f037344371662e145c7e207cb54f8" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.314583 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c8765" Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.528588 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.528955 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e2555dac-6f8e-4fb5-9281-f21c939c8077" containerName="nova-scheduler-scheduler" containerID="cri-o://58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86" gracePeriod=30 Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.546458 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.546842 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-log" containerID="cri-o://16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220" gracePeriod=30 Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.547552 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-api" containerID="cri-o://718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985" gracePeriod=30 Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.558249 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.558558 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-log" containerID="cri-o://ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476" gracePeriod=30 Mar 19 20:26:30 crc kubenswrapper[4799]: I0319 20:26:30.559076 4799 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-metadata" containerID="cri-o://fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c" gracePeriod=30 Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.158301 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.234738 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-internal-tls-certs\") pod \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.234989 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-config-data\") pod \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.235101 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-public-tls-certs\") pod \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.235263 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-combined-ca-bundle\") pod \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.235434 4799 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-j56pd\" (UniqueName: \"kubernetes.io/projected/f666539f-aa23-40f9-b6cb-6cd15d8c1729-kube-api-access-j56pd\") pod \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.235585 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666539f-aa23-40f9-b6cb-6cd15d8c1729-logs\") pod \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\" (UID: \"f666539f-aa23-40f9-b6cb-6cd15d8c1729\") " Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.235851 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f666539f-aa23-40f9-b6cb-6cd15d8c1729-logs" (OuterVolumeSpecName: "logs") pod "f666539f-aa23-40f9-b6cb-6cd15d8c1729" (UID: "f666539f-aa23-40f9-b6cb-6cd15d8c1729"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.236459 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666539f-aa23-40f9-b6cb-6cd15d8c1729-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.247980 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f666539f-aa23-40f9-b6cb-6cd15d8c1729-kube-api-access-j56pd" (OuterVolumeSpecName: "kube-api-access-j56pd") pod "f666539f-aa23-40f9-b6cb-6cd15d8c1729" (UID: "f666539f-aa23-40f9-b6cb-6cd15d8c1729"). InnerVolumeSpecName "kube-api-access-j56pd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.333731 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-config-data" (OuterVolumeSpecName: "config-data") pod "f666539f-aa23-40f9-b6cb-6cd15d8c1729" (UID: "f666539f-aa23-40f9-b6cb-6cd15d8c1729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339473 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f666539f-aa23-40f9-b6cb-6cd15d8c1729" (UID: "f666539f-aa23-40f9-b6cb-6cd15d8c1729"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339645 4799 generic.go:334] "Generic (PLEG): container finished" podID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerID="718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985" exitCode=0 Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339677 4799 generic.go:334] "Generic (PLEG): container finished" podID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerID="16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220" exitCode=143 Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339741 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f666539f-aa23-40f9-b6cb-6cd15d8c1729","Type":"ContainerDied","Data":"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985"} Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339806 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f666539f-aa23-40f9-b6cb-6cd15d8c1729","Type":"ContainerDied","Data":"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220"} Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339864 4799 scope.go:117] "RemoveContainer" containerID="718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339888 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f666539f-aa23-40f9-b6cb-6cd15d8c1729","Type":"ContainerDied","Data":"a68fb2683af077dca4deb960508e2d839357358b17af23a0bae45a717a4a8ac8"} Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.339960 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.340678 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j56pd\" (UniqueName: \"kubernetes.io/projected/f666539f-aa23-40f9-b6cb-6cd15d8c1729-kube-api-access-j56pd\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.340771 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.354643 4799 generic.go:334] "Generic (PLEG): container finished" podID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerID="ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476" exitCode=143 Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.354687 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4dda1e2e-c642-491e-ba96-26e9638bc902","Type":"ContainerDied","Data":"ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476"} Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.357150 4799 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f666539f-aa23-40f9-b6cb-6cd15d8c1729" (UID: "f666539f-aa23-40f9-b6cb-6cd15d8c1729"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.362303 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f666539f-aa23-40f9-b6cb-6cd15d8c1729" (UID: "f666539f-aa23-40f9-b6cb-6cd15d8c1729"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.368721 4799 scope.go:117] "RemoveContainer" containerID="16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.389841 4799 scope.go:117] "RemoveContainer" containerID="718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985" Mar 19 20:26:31 crc kubenswrapper[4799]: E0319 20:26:31.390299 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985\": container with ID starting with 718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985 not found: ID does not exist" containerID="718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.390407 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985"} err="failed to get container status \"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985\": rpc error: code = NotFound desc = could not 
find container \"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985\": container with ID starting with 718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985 not found: ID does not exist" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.390493 4799 scope.go:117] "RemoveContainer" containerID="16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220" Mar 19 20:26:31 crc kubenswrapper[4799]: E0319 20:26:31.391449 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220\": container with ID starting with 16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220 not found: ID does not exist" containerID="16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.391481 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220"} err="failed to get container status \"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220\": rpc error: code = NotFound desc = could not find container \"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220\": container with ID starting with 16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220 not found: ID does not exist" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.391503 4799 scope.go:117] "RemoveContainer" containerID="718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.391746 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985"} err="failed to get container status \"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985\": rpc error: code = NotFound desc = 
could not find container \"718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985\": container with ID starting with 718edd3deed06737b1068a4de687ec486cfb14caf02c8f1031b39e256a1ce985 not found: ID does not exist" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.391818 4799 scope.go:117] "RemoveContainer" containerID="16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.392120 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220"} err="failed to get container status \"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220\": rpc error: code = NotFound desc = could not find container \"16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220\": container with ID starting with 16aff21b42cdb65c2ef08f2e86660713c250b25876514466aae86fe289177220 not found: ID does not exist" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.446740 4799 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.446794 4799 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:31 crc kubenswrapper[4799]: I0319 20:26:31.446806 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666539f-aa23-40f9-b6cb-6cd15d8c1729-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.705975 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: 
I0319 20:26:31.725868 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.775840 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:31.776474 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerName="dnsmasq-dns" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776489 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerName="dnsmasq-dns" Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:31.776502 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerName="init" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776508 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerName="init" Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:31.776525 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-log" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776531 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-log" Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:31.776542 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149a1d2-5a92-4549-921c-c6a14131c0c7" containerName="nova-manage" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776547 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149a1d2-5a92-4549-921c-c6a14131c0c7" containerName="nova-manage" Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:31.776568 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-api" Mar 
19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776576 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-api" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776738 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-api" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776758 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149a1d2-5a92-4549-921c-c6a14131c0c7" containerName="nova-manage" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776770 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" containerName="nova-api-log" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.776783 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb382d8f-526e-4f8c-bc45-d1afd184fb98" containerName="dnsmasq-dns" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.777767 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.780062 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.780113 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.780407 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.786363 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.853816 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.853877 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-public-tls-certs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.853900 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmlv8\" (UniqueName: \"kubernetes.io/projected/f822edd3-bebb-4a8a-9755-60419527dbde-kube-api-access-hmlv8\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.853930 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-config-data\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.853970 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.854092 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f822edd3-bebb-4a8a-9755-60419527dbde-logs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.886808 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.956866 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.956917 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-public-tls-certs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.956939 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmlv8\" (UniqueName: \"kubernetes.io/projected/f822edd3-bebb-4a8a-9755-60419527dbde-kube-api-access-hmlv8\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.956964 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-config-data\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.956999 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.957102 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f822edd3-bebb-4a8a-9755-60419527dbde-logs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.958756 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f822edd3-bebb-4a8a-9755-60419527dbde-logs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.963458 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.964405 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-config-data\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.966875 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-public-tls-certs\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.971633 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f822edd3-bebb-4a8a-9755-60419527dbde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:31.976881 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hmlv8\" (UniqueName: \"kubernetes.io/projected/f822edd3-bebb-4a8a-9755-60419527dbde-kube-api-access-hmlv8\") pod \"nova-api-0\" (UID: \"f822edd3-bebb-4a8a-9755-60419527dbde\") " pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.059937 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-combined-ca-bundle\") pod \"e2555dac-6f8e-4fb5-9281-f21c939c8077\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.060077 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-config-data\") pod \"e2555dac-6f8e-4fb5-9281-f21c939c8077\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.060189 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5478\" (UniqueName: \"kubernetes.io/projected/e2555dac-6f8e-4fb5-9281-f21c939c8077-kube-api-access-b5478\") pod \"e2555dac-6f8e-4fb5-9281-f21c939c8077\" (UID: \"e2555dac-6f8e-4fb5-9281-f21c939c8077\") " Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.071275 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2555dac-6f8e-4fb5-9281-f21c939c8077-kube-api-access-b5478" (OuterVolumeSpecName: "kube-api-access-b5478") pod "e2555dac-6f8e-4fb5-9281-f21c939c8077" (UID: "e2555dac-6f8e-4fb5-9281-f21c939c8077"). InnerVolumeSpecName "kube-api-access-b5478". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.089116 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2555dac-6f8e-4fb5-9281-f21c939c8077" (UID: "e2555dac-6f8e-4fb5-9281-f21c939c8077"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.098681 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-config-data" (OuterVolumeSpecName: "config-data") pod "e2555dac-6f8e-4fb5-9281-f21c939c8077" (UID: "e2555dac-6f8e-4fb5-9281-f21c939c8077"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.163212 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.163239 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2555dac-6f8e-4fb5-9281-f21c939c8077-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.163249 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5478\" (UniqueName: \"kubernetes.io/projected/e2555dac-6f8e-4fb5-9281-f21c939c8077-kube-api-access-b5478\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.176753 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.379567 4799 generic.go:334] "Generic (PLEG): container finished" podID="e2555dac-6f8e-4fb5-9281-f21c939c8077" containerID="58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86" exitCode=0 Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.379727 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.379761 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2555dac-6f8e-4fb5-9281-f21c939c8077","Type":"ContainerDied","Data":"58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86"} Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.380631 4799 scope.go:117] "RemoveContainer" containerID="58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.381247 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e2555dac-6f8e-4fb5-9281-f21c939c8077","Type":"ContainerDied","Data":"9b036b39f1171f5031ff4125e2816942689c488e5d16bdd7f641151d613c436a"} Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.401869 4799 scope.go:117] "RemoveContainer" containerID="58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86" Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:32.402259 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86\": container with ID starting with 58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86 not found: ID does not exist" containerID="58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.402292 4799 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86"} err="failed to get container status \"58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86\": rpc error: code = NotFound desc = could not find container \"58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86\": container with ID starting with 58ba2a31a9303738a11c0b3c45d96818a9509721761e430b553afd8357d11a86 not found: ID does not exist" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.421400 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.450178 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.466474 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: E0319 20:26:32.467026 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2555dac-6f8e-4fb5-9281-f21c939c8077" containerName="nova-scheduler-scheduler" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.467047 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2555dac-6f8e-4fb5-9281-f21c939c8077" containerName="nova-scheduler-scheduler" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.467220 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2555dac-6f8e-4fb5-9281-f21c939c8077" containerName="nova-scheduler-scheduler" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.467951 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.469906 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.479241 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.571241 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2add32a8-30b6-4000-bdd7-c96cda2bb599-config-data\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.571321 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvt4q\" (UniqueName: \"kubernetes.io/projected/2add32a8-30b6-4000-bdd7-c96cda2bb599-kube-api-access-lvt4q\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.571351 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add32a8-30b6-4000-bdd7-c96cda2bb599-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.657755 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 20:26:32 crc kubenswrapper[4799]: W0319 20:26:32.666085 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf822edd3_bebb_4a8a_9755_60419527dbde.slice/crio-316af82cd4be1705e8aaaf6f32946e02d438871d3d722b396705fb4702f3b25c 
WatchSource:0}: Error finding container 316af82cd4be1705e8aaaf6f32946e02d438871d3d722b396705fb4702f3b25c: Status 404 returned error can't find the container with id 316af82cd4be1705e8aaaf6f32946e02d438871d3d722b396705fb4702f3b25c Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.672715 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2add32a8-30b6-4000-bdd7-c96cda2bb599-config-data\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.672824 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add32a8-30b6-4000-bdd7-c96cda2bb599-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.672903 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvt4q\" (UniqueName: \"kubernetes.io/projected/2add32a8-30b6-4000-bdd7-c96cda2bb599-kube-api-access-lvt4q\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.679906 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2add32a8-30b6-4000-bdd7-c96cda2bb599-config-data\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.680498 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add32a8-30b6-4000-bdd7-c96cda2bb599-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.690130 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvt4q\" (UniqueName: \"kubernetes.io/projected/2add32a8-30b6-4000-bdd7-c96cda2bb599-kube-api-access-lvt4q\") pod \"nova-scheduler-0\" (UID: \"2add32a8-30b6-4000-bdd7-c96cda2bb599\") " pod="openstack/nova-scheduler-0" Mar 19 20:26:32 crc kubenswrapper[4799]: I0319 20:26:32.792455 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.128194 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2555dac-6f8e-4fb5-9281-f21c939c8077" path="/var/lib/kubelet/pods/e2555dac-6f8e-4fb5-9281-f21c939c8077/volumes" Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.129218 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f666539f-aa23-40f9-b6cb-6cd15d8c1729" path="/var/lib/kubelet/pods/f666539f-aa23-40f9-b6cb-6cd15d8c1729/volumes" Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.307212 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 20:26:33 crc kubenswrapper[4799]: W0319 20:26:33.314583 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2add32a8_30b6_4000_bdd7_c96cda2bb599.slice/crio-cd185f687397a41b2f8db9c578ff74362f593daf022e9bdb53e6c6303ed6639d WatchSource:0}: Error finding container cd185f687397a41b2f8db9c578ff74362f593daf022e9bdb53e6c6303ed6639d: Status 404 returned error can't find the container with id cd185f687397a41b2f8db9c578ff74362f593daf022e9bdb53e6c6303ed6639d Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.406916 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2add32a8-30b6-4000-bdd7-c96cda2bb599","Type":"ContainerStarted","Data":"cd185f687397a41b2f8db9c578ff74362f593daf022e9bdb53e6c6303ed6639d"} Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.409767 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f822edd3-bebb-4a8a-9755-60419527dbde","Type":"ContainerStarted","Data":"6feebe2d7438f5160a79341ae65132e1fdbfcf6a1fedb72651f625429e140243"} Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.409815 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f822edd3-bebb-4a8a-9755-60419527dbde","Type":"ContainerStarted","Data":"2c07da02dc35a487a9f3c230106234be00b7f886dca0ee5503318f39fb672ecd"} Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.409826 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f822edd3-bebb-4a8a-9755-60419527dbde","Type":"ContainerStarted","Data":"316af82cd4be1705e8aaaf6f32946e02d438871d3d722b396705fb4702f3b25c"} Mar 19 20:26:33 crc kubenswrapper[4799]: I0319 20:26:33.432353 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.432329566 podStartE2EDuration="2.432329566s" podCreationTimestamp="2026-03-19 20:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:33.427064069 +0000 UTC m=+1271.033017151" watchObservedRunningTime="2026-03-19 20:26:33.432329566 +0000 UTC m=+1271.038282638" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.183848 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.321291 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-824r6\" (UniqueName: \"kubernetes.io/projected/4dda1e2e-c642-491e-ba96-26e9638bc902-kube-api-access-824r6\") pod \"4dda1e2e-c642-491e-ba96-26e9638bc902\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.321346 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-config-data\") pod \"4dda1e2e-c642-491e-ba96-26e9638bc902\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.321595 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-combined-ca-bundle\") pod \"4dda1e2e-c642-491e-ba96-26e9638bc902\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.321642 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dda1e2e-c642-491e-ba96-26e9638bc902-logs\") pod \"4dda1e2e-c642-491e-ba96-26e9638bc902\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.321686 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-nova-metadata-tls-certs\") pod \"4dda1e2e-c642-491e-ba96-26e9638bc902\" (UID: \"4dda1e2e-c642-491e-ba96-26e9638bc902\") " Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.323048 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4dda1e2e-c642-491e-ba96-26e9638bc902-logs" (OuterVolumeSpecName: "logs") pod "4dda1e2e-c642-491e-ba96-26e9638bc902" (UID: "4dda1e2e-c642-491e-ba96-26e9638bc902"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.326932 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dda1e2e-c642-491e-ba96-26e9638bc902-kube-api-access-824r6" (OuterVolumeSpecName: "kube-api-access-824r6") pod "4dda1e2e-c642-491e-ba96-26e9638bc902" (UID: "4dda1e2e-c642-491e-ba96-26e9638bc902"). InnerVolumeSpecName "kube-api-access-824r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.346887 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-config-data" (OuterVolumeSpecName: "config-data") pod "4dda1e2e-c642-491e-ba96-26e9638bc902" (UID: "4dda1e2e-c642-491e-ba96-26e9638bc902"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.387556 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dda1e2e-c642-491e-ba96-26e9638bc902" (UID: "4dda1e2e-c642-491e-ba96-26e9638bc902"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.389535 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4dda1e2e-c642-491e-ba96-26e9638bc902" (UID: "4dda1e2e-c642-491e-ba96-26e9638bc902"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.420735 4799 generic.go:334] "Generic (PLEG): container finished" podID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerID="fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c" exitCode=0 Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.420842 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4dda1e2e-c642-491e-ba96-26e9638bc902","Type":"ContainerDied","Data":"fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c"} Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.420885 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4dda1e2e-c642-491e-ba96-26e9638bc902","Type":"ContainerDied","Data":"c370cc172844760b98db0917b14890e91f070b597dfca14ce691a2ce781f975b"} Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.420911 4799 scope.go:117] "RemoveContainer" containerID="fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.421210 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.423571 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.423589 4799 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dda1e2e-c642-491e-ba96-26e9638bc902-logs\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.423601 4799 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.423612 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-824r6\" (UniqueName: \"kubernetes.io/projected/4dda1e2e-c642-491e-ba96-26e9638bc902-kube-api-access-824r6\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.423622 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda1e2e-c642-491e-ba96-26e9638bc902-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.425121 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2add32a8-30b6-4000-bdd7-c96cda2bb599","Type":"ContainerStarted","Data":"65a28bcdf281c645b8843bbfe2a5eeddef9745c37a80f79b857446e936f252dc"} Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.444732 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.444696963 podStartE2EDuration="2.444696963s" podCreationTimestamp="2026-03-19 20:26:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:34.438863161 +0000 UTC m=+1272.044816243" watchObservedRunningTime="2026-03-19 20:26:34.444696963 +0000 UTC m=+1272.050650035" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.454486 4799 scope.go:117] "RemoveContainer" containerID="ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.471960 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.484023 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.486609 4799 scope.go:117] "RemoveContainer" containerID="fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c" Mar 19 20:26:34 crc kubenswrapper[4799]: E0319 20:26:34.487360 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c\": container with ID starting with fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c not found: ID does not exist" containerID="fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.487403 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c"} err="failed to get container status \"fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c\": rpc error: code = NotFound desc = could not find container \"fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c\": container with ID starting with fce3f79145056d18c1236b8a2d084eab3853c160c2590c84f32b269800294a6c not found: ID does not exist" Mar 19 
20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.487423 4799 scope.go:117] "RemoveContainer" containerID="ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476" Mar 19 20:26:34 crc kubenswrapper[4799]: E0319 20:26:34.488128 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476\": container with ID starting with ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476 not found: ID does not exist" containerID="ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.488163 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476"} err="failed to get container status \"ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476\": rpc error: code = NotFound desc = could not find container \"ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476\": container with ID starting with ec78e20c8ac1db23faf5113da6bb95591b9fb340ea5796564844ea0ae84e2476 not found: ID does not exist" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.500961 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:26:34 crc kubenswrapper[4799]: E0319 20:26:34.501426 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-metadata" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.501444 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-metadata" Mar 19 20:26:34 crc kubenswrapper[4799]: E0319 20:26:34.501459 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-log" Mar 
19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.501465 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-log" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.501647 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-metadata" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.501662 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" containerName="nova-metadata-log" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.502900 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.505095 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.505177 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.508964 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.628483 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.628657 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-config-data\") pod \"nova-metadata-0\" (UID: 
\"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.628699 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-logs\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.628781 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.628826 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxzz\" (UniqueName: \"kubernetes.io/projected/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-kube-api-access-6xxzz\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.731165 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.731245 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxzz\" (UniqueName: \"kubernetes.io/projected/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-kube-api-access-6xxzz\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc 
kubenswrapper[4799]: I0319 20:26:34.731291 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.731468 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-config-data\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.734640 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-logs\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.735473 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-logs\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.736870 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-config-data\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.737564 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.737759 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.751041 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxzz\" (UniqueName: \"kubernetes.io/projected/a1b0148b-b48b-44b2-9fee-1fd4389fbf77-kube-api-access-6xxzz\") pod \"nova-metadata-0\" (UID: \"a1b0148b-b48b-44b2-9fee-1fd4389fbf77\") " pod="openstack/nova-metadata-0" Mar 19 20:26:34 crc kubenswrapper[4799]: I0319 20:26:34.817908 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 20:26:35 crc kubenswrapper[4799]: I0319 20:26:35.127691 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dda1e2e-c642-491e-ba96-26e9638bc902" path="/var/lib/kubelet/pods/4dda1e2e-c642-491e-ba96-26e9638bc902/volumes" Mar 19 20:26:35 crc kubenswrapper[4799]: I0319 20:26:35.372803 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 20:26:35 crc kubenswrapper[4799]: I0319 20:26:35.435913 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a1b0148b-b48b-44b2-9fee-1fd4389fbf77","Type":"ContainerStarted","Data":"1864fe3309a98a191d839365007e185609d595fc449bd459fc7bda6ce399bb56"} Mar 19 20:26:36 crc kubenswrapper[4799]: I0319 20:26:36.453832 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a1b0148b-b48b-44b2-9fee-1fd4389fbf77","Type":"ContainerStarted","Data":"5e4bb7c8fcd45e8d75dc136371893576c485c9c7be7d51e6798a8aca6769ee5c"} Mar 19 20:26:36 crc kubenswrapper[4799]: I0319 20:26:36.454317 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a1b0148b-b48b-44b2-9fee-1fd4389fbf77","Type":"ContainerStarted","Data":"d73cc99456cdb3edaf4f9a87612699f676c1af6973bf36246b71be2a85c78ead"} Mar 19 20:26:36 crc kubenswrapper[4799]: I0319 20:26:36.511585 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.511558421 podStartE2EDuration="2.511558421s" podCreationTimestamp="2026-03-19 20:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:26:36.486555694 +0000 UTC m=+1274.092508826" watchObservedRunningTime="2026-03-19 20:26:36.511558421 +0000 UTC m=+1274.117511533" Mar 19 20:26:37 crc kubenswrapper[4799]: I0319 20:26:37.792674 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 20:26:42 crc kubenswrapper[4799]: I0319 20:26:42.177607 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 20:26:42 crc kubenswrapper[4799]: I0319 20:26:42.178804 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 20:26:42 crc kubenswrapper[4799]: I0319 20:26:42.793823 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 20:26:42 crc kubenswrapper[4799]: I0319 20:26:42.852089 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 20:26:43 crc kubenswrapper[4799]: I0319 20:26:43.193594 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="f822edd3-bebb-4a8a-9755-60419527dbde" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:26:43 crc kubenswrapper[4799]: I0319 20:26:43.193955 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f822edd3-bebb-4a8a-9755-60419527dbde" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:26:43 crc kubenswrapper[4799]: I0319 20:26:43.580822 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 20:26:44 crc kubenswrapper[4799]: I0319 20:26:44.818795 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 20:26:44 crc kubenswrapper[4799]: I0319 20:26:44.820502 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 20:26:45 crc kubenswrapper[4799]: I0319 20:26:45.843659 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a1b0148b-b48b-44b2-9fee-1fd4389fbf77" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:26:45 crc kubenswrapper[4799]: I0319 20:26:45.843714 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a1b0148b-b48b-44b2-9fee-1fd4389fbf77" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:26:48 crc kubenswrapper[4799]: I0319 20:26:48.579707 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Mar 19 20:26:50 crc kubenswrapper[4799]: I0319 20:26:50.177602 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 20:26:50 crc kubenswrapper[4799]: I0319 20:26:50.177689 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 20:26:52 crc kubenswrapper[4799]: I0319 20:26:52.186123 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 20:26:52 crc kubenswrapper[4799]: I0319 20:26:52.190043 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 20:26:52 crc kubenswrapper[4799]: I0319 20:26:52.193856 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 20:26:52 crc kubenswrapper[4799]: I0319 20:26:52.665796 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 20:26:52 crc kubenswrapper[4799]: I0319 20:26:52.819339 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 20:26:52 crc kubenswrapper[4799]: I0319 20:26:52.819751 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 20:26:54 crc kubenswrapper[4799]: I0319 20:26:54.831306 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 20:26:54 crc kubenswrapper[4799]: I0319 20:26:54.831462 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 20:26:54 crc kubenswrapper[4799]: I0319 20:26:54.840479 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 20:26:54 crc kubenswrapper[4799]: I0319 20:26:54.841068 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 19 20:26:56 crc kubenswrapper[4799]: I0319 20:26:56.072158 4799 scope.go:117] "RemoveContainer" containerID="ab7a7269c37b8ab72c829be828b09c90cf05f52c872b4863329f824043350000" Mar 19 20:26:58 crc kubenswrapper[4799]: I0319 20:26:58.755480 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:26:58 crc kubenswrapper[4799]: I0319 20:26:58.756126 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:27:02 crc kubenswrapper[4799]: I0319 20:27:02.545997 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:27:03 crc kubenswrapper[4799]: I0319 20:27:03.964875 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:27:06 crc kubenswrapper[4799]: I0319 20:27:06.709524 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="rabbitmq" containerID="cri-o://f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb" gracePeriod=604796 Mar 19 20:27:08 crc kubenswrapper[4799]: I0319 20:27:08.348150 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="rabbitmq" containerID="cri-o://2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d" gracePeriod=604796 Mar 19 20:27:09 crc 
kubenswrapper[4799]: I0319 20:27:09.252285 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Mar 19 20:27:09 crc kubenswrapper[4799]: I0319 20:27:09.322903 4799 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.377743 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509370 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509506 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-plugins\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509530 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/749a043f-5262-416d-b639-9ff8fdcf7f12-pod-info\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509565 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-config-data\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509603 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-erlang-cookie\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509622 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-confd\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509643 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9cfx\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-kube-api-access-z9cfx\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509726 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-tls\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509742 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-server-conf\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc 
kubenswrapper[4799]: I0319 20:27:13.509766 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-plugins-conf\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.509785 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/749a043f-5262-416d-b639-9ff8fdcf7f12-erlang-cookie-secret\") pod \"749a043f-5262-416d-b639-9ff8fdcf7f12\" (UID: \"749a043f-5262-416d-b639-9ff8fdcf7f12\") " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.510264 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.511004 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.512261 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.517073 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.518555 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.520921 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-kube-api-access-z9cfx" (OuterVolumeSpecName: "kube-api-access-z9cfx") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "kube-api-access-z9cfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.523555 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749a043f-5262-416d-b639-9ff8fdcf7f12-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.525196 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/749a043f-5262-416d-b639-9ff8fdcf7f12-pod-info" (OuterVolumeSpecName: "pod-info") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.552474 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-config-data" (OuterVolumeSpecName: "config-data") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.579795 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-server-conf" (OuterVolumeSpecName: "server-conf") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612418 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612578 4799 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612657 4799 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612727 4799 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/749a043f-5262-416d-b639-9ff8fdcf7f12-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612811 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612871 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612925 4799 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/749a043f-5262-416d-b639-9ff8fdcf7f12-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.612986 4799 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/749a043f-5262-416d-b639-9ff8fdcf7f12-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.613048 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.613101 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9cfx\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-kube-api-access-z9cfx\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.634762 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "749a043f-5262-416d-b639-9ff8fdcf7f12" (UID: "749a043f-5262-416d-b639-9ff8fdcf7f12"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.661566 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.714308 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.714337 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/749a043f-5262-416d-b639-9ff8fdcf7f12-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.885511 4799 generic.go:334] "Generic (PLEG): container finished" podID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerID="f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb" exitCode=0 Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.885551 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"749a043f-5262-416d-b639-9ff8fdcf7f12","Type":"ContainerDied","Data":"f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb"} Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.885575 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"749a043f-5262-416d-b639-9ff8fdcf7f12","Type":"ContainerDied","Data":"e874606cd843e1968c2986c7c98cecacf31032b9c2b1a34185f32ca0cfb6edf7"} Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.885591 4799 scope.go:117] "RemoveContainer" containerID="f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.885745 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.920793 4799 scope.go:117] "RemoveContainer" containerID="d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.931098 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.940212 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.947207 4799 scope.go:117] "RemoveContainer" containerID="f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb" Mar 19 20:27:13 crc kubenswrapper[4799]: E0319 20:27:13.947581 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb\": container with ID starting with f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb not found: ID does not exist" containerID="f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.947610 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb"} err="failed to get container status \"f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb\": rpc error: code = NotFound desc = could not find container \"f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb\": container with ID starting with f71273a770b5a80961a8bbd50d69e1f53995551208f4eaa090163c8f0fd009bb not found: ID does not exist" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.947629 4799 scope.go:117] "RemoveContainer" containerID="d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074" Mar 19 20:27:13 crc 
kubenswrapper[4799]: E0319 20:27:13.948061 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074\": container with ID starting with d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074 not found: ID does not exist" containerID="d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.948104 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074"} err="failed to get container status \"d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074\": rpc error: code = NotFound desc = could not find container \"d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074\": container with ID starting with d3947347830b88deac8b6c400780279bc93369d869c1acb31b8851f0ac7ec074 not found: ID does not exist" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.973413 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:27:13 crc kubenswrapper[4799]: E0319 20:27:13.973904 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="setup-container" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.973924 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="setup-container" Mar 19 20:27:13 crc kubenswrapper[4799]: E0319 20:27:13.973943 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="rabbitmq" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.973949 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="rabbitmq" Mar 19 20:27:13 crc 
kubenswrapper[4799]: I0319 20:27:13.974128 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" containerName="rabbitmq" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.975179 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.978947 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.978983 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.978947 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.979192 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.981046 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.981489 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.983351 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lfwkh" Mar 19 20:27:13 crc kubenswrapper[4799]: I0319 20:27:13.988935 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121573 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" 
(UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121639 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121726 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-config-data\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121761 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121803 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jpzn\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-kube-api-access-5jpzn\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121823 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " 
pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.121992 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.122068 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.122150 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3117828b-97c2-41b6-a48d-cf7154e2bb71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.122222 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.122336 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3117828b-97c2-41b6-a48d-cf7154e2bb71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.224642 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3117828b-97c2-41b6-a48d-cf7154e2bb71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.225820 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.226910 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.227751 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3117828b-97c2-41b6-a48d-cf7154e2bb71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228145 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228414 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-config-data\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228482 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228554 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228651 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jpzn\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-kube-api-access-5jpzn\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228704 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " 
pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228767 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.228973 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.229074 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.229536 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-config-data\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.229900 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.230699 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3117828b-97c2-41b6-a48d-cf7154e2bb71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.232695 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3117828b-97c2-41b6-a48d-cf7154e2bb71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.232772 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.233459 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.234140 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3117828b-97c2-41b6-a48d-cf7154e2bb71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.258284 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jpzn\" (UniqueName: \"kubernetes.io/projected/3117828b-97c2-41b6-a48d-cf7154e2bb71-kube-api-access-5jpzn\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " 
pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.266844 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3117828b-97c2-41b6-a48d-cf7154e2bb71\") " pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.334821 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.832708 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.848685 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.937328 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3117828b-97c2-41b6-a48d-cf7154e2bb71","Type":"ContainerStarted","Data":"1acee6212a6debbbf41d14da45309ef32b52eec95930446b335470fc2a932ff6"} Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.941828 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerID="2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d" exitCode=0 Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.941871 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2ee15a17-4d32-468e-8a57-2a597cebd850","Type":"ContainerDied","Data":"2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d"} Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.941896 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2ee15a17-4d32-468e-8a57-2a597cebd850","Type":"ContainerDied","Data":"3cbc7a6aeda24d6a8b2a9246dcce136892709025fa19de5ce5622db489182542"} Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.941923 4799 scope.go:117] "RemoveContainer" containerID="2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.942079 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.945883 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-tls\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.946148 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-confd\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.946257 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-plugins\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.946281 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-config-data\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.946301 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.946347 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ee15a17-4d32-468e-8a57-2a597cebd850-pod-info\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.946432 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-plugins-conf\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.947678 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.949005 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ee15a17-4d32-468e-8a57-2a597cebd850-erlang-cookie-secret\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.949050 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6b2h\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-kube-api-access-h6b2h\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.949323 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-server-conf\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.949629 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2ee15a17-4d32-468e-8a57-2a597cebd850-pod-info" (OuterVolumeSpecName: "pod-info") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.963679 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.965644 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-erlang-cookie\") pod \"2ee15a17-4d32-468e-8a57-2a597cebd850\" (UID: \"2ee15a17-4d32-468e-8a57-2a597cebd850\") " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.966755 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.967596 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.967705 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969111 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-kube-api-access-h6b2h" (OuterVolumeSpecName: "kube-api-access-h6b2h") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "kube-api-access-h6b2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969355 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969422 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969437 4799 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2ee15a17-4d32-468e-8a57-2a597cebd850-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969447 4799 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969459 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6b2h\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-kube-api-access-h6b2h\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969471 4799 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.969481 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:14 crc kubenswrapper[4799]: I0319 20:27:14.970639 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee15a17-4d32-468e-8a57-2a597cebd850-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.006750 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.034430 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-config-data" (OuterVolumeSpecName: "config-data") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.065596 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-server-conf" (OuterVolumeSpecName: "server-conf") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.071324 4799 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2ee15a17-4d32-468e-8a57-2a597cebd850-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.071362 4799 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.071373 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ee15a17-4d32-468e-8a57-2a597cebd850-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.071401 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.114587 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2ee15a17-4d32-468e-8a57-2a597cebd850" (UID: "2ee15a17-4d32-468e-8a57-2a597cebd850"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.128765 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749a043f-5262-416d-b639-9ff8fdcf7f12" path="/var/lib/kubelet/pods/749a043f-5262-416d-b639-9ff8fdcf7f12/volumes" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.172847 4799 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2ee15a17-4d32-468e-8a57-2a597cebd850-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.186902 4799 scope.go:117] "RemoveContainer" containerID="37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.218661 4799 scope.go:117] "RemoveContainer" containerID="2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d" Mar 19 20:27:15 crc kubenswrapper[4799]: E0319 20:27:15.219092 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d\": container with ID starting with 2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d not found: ID does not exist" containerID="2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.219135 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d"} err="failed to get container status \"2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d\": rpc error: code = NotFound desc = could not find container \"2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d\": container with ID starting with 2319c4bdc4233f063d9a08472678007222db22067e07fe110c3c4872a465518d not found: ID does not exist" Mar 19 
20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.219161 4799 scope.go:117] "RemoveContainer" containerID="37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e" Mar 19 20:27:15 crc kubenswrapper[4799]: E0319 20:27:15.219527 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e\": container with ID starting with 37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e not found: ID does not exist" containerID="37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.219569 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e"} err="failed to get container status \"37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e\": rpc error: code = NotFound desc = could not find container \"37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e\": container with ID starting with 37738696410aff59b3692d0c78e4650f7c93abdd7a7b2bd06ccd0036fa01607e not found: ID does not exist" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.267786 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.281549 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.295450 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:27:15 crc kubenswrapper[4799]: E0319 20:27:15.296199 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="setup-container" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.296289 4799 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="setup-container" Mar 19 20:27:15 crc kubenswrapper[4799]: E0319 20:27:15.296379 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="rabbitmq" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.296465 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="rabbitmq" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.296794 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" containerName="rabbitmq" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.298156 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.300022 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.300031 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.301555 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.303491 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.304691 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.312891 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.313092 4799 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v9gjw" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.313378 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.478875 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwwh\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-kube-api-access-6hwwh\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.479597 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.479780 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.480104 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.480341 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.480587 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1c5d7a-7501-4c34-9823-c996a2413399-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.481024 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.481328 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.482074 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.482252 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/1b1c5d7a-7501-4c34-9823-c996a2413399-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.482441 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.584849 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwwh\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-kube-api-access-6hwwh\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.585261 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.585574 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.585643 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586169 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586238 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586286 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b1c5d7a-7501-4c34-9823-c996a2413399-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586355 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586403 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586437 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586464 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b1c5d7a-7501-4c34-9823-c996a2413399-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586502 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.586926 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.587140 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 
20:27:15.587352 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.587411 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.587803 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b1c5d7a-7501-4c34-9823-c996a2413399-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.591315 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.592020 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.593042 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/1b1c5d7a-7501-4c34-9823-c996a2413399-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.596047 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b1c5d7a-7501-4c34-9823-c996a2413399-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.605565 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwwh\" (UniqueName: \"kubernetes.io/projected/1b1c5d7a-7501-4c34-9823-c996a2413399-kube-api-access-6hwwh\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.623087 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658d9cd857-b7h82"] Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.625035 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.632490 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658d9cd857-b7h82"] Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.659867 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.667203 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b1c5d7a-7501-4c34-9823-c996a2413399\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.789556 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.790275 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.790357 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-swift-storage-0\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 
20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.790423 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-svc\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.790450 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.790568 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-config\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.790631 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7bx\" (UniqueName: \"kubernetes.io/projected/3365676b-9535-4e50-93c5-2adb6f8ec4b9-kube-api-access-dp7bx\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.891985 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" 
Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.892127 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.892157 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-swift-storage-0\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.892212 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-svc\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.892238 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.892269 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-config\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.892306 
4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7bx\" (UniqueName: \"kubernetes.io/projected/3365676b-9535-4e50-93c5-2adb6f8ec4b9-kube-api-access-dp7bx\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.893105 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.893558 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.893732 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-svc\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.893956 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-config\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.894049 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.894073 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-swift-storage-0\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.923311 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:15 crc kubenswrapper[4799]: I0319 20:27:15.974340 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7bx\" (UniqueName: \"kubernetes.io/projected/3365676b-9535-4e50-93c5-2adb6f8ec4b9-kube-api-access-dp7bx\") pod \"dnsmasq-dns-658d9cd857-b7h82\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.004264 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.551271 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 20:27:16 crc kubenswrapper[4799]: W0319 20:27:16.579440 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1c5d7a_7501_4c34_9823_c996a2413399.slice/crio-21c3848fb02e1dcd322bdd591aca6f911c078cf6ffb53cec9bfbf2e0d206f078 WatchSource:0}: Error finding container 21c3848fb02e1dcd322bdd591aca6f911c078cf6ffb53cec9bfbf2e0d206f078: Status 404 returned error can't find the container with id 21c3848fb02e1dcd322bdd591aca6f911c078cf6ffb53cec9bfbf2e0d206f078 Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.650923 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658d9cd857-b7h82"] Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.959928 4799 generic.go:334] "Generic (PLEG): container finished" podID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerID="a9b3037206f8eef45758ab3c05a920ffce064f2e9464418e831fadf937e81d39" exitCode=0 Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.959989 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" event={"ID":"3365676b-9535-4e50-93c5-2adb6f8ec4b9","Type":"ContainerDied","Data":"a9b3037206f8eef45758ab3c05a920ffce064f2e9464418e831fadf937e81d39"} Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.960044 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" event={"ID":"3365676b-9535-4e50-93c5-2adb6f8ec4b9","Type":"ContainerStarted","Data":"d05a60a131fcfc84a0059605ba51b5d9d85720a2dbff7325a48af3d36b15f217"} Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.962122 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"1b1c5d7a-7501-4c34-9823-c996a2413399","Type":"ContainerStarted","Data":"21c3848fb02e1dcd322bdd591aca6f911c078cf6ffb53cec9bfbf2e0d206f078"} Mar 19 20:27:16 crc kubenswrapper[4799]: I0319 20:27:16.964519 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3117828b-97c2-41b6-a48d-cf7154e2bb71","Type":"ContainerStarted","Data":"de7203a57bf668d5ffed43ffd811481d1d9f2d92ba84af317e6a21af85a2af6b"} Mar 19 20:27:17 crc kubenswrapper[4799]: I0319 20:27:17.138599 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee15a17-4d32-468e-8a57-2a597cebd850" path="/var/lib/kubelet/pods/2ee15a17-4d32-468e-8a57-2a597cebd850/volumes" Mar 19 20:27:17 crc kubenswrapper[4799]: I0319 20:27:17.979240 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" event={"ID":"3365676b-9535-4e50-93c5-2adb6f8ec4b9","Type":"ContainerStarted","Data":"091c67f71c1e7ef5682647a89e6fa63babd29ed43d3973669e4568cb50bf4285"} Mar 19 20:27:18 crc kubenswrapper[4799]: I0319 20:27:18.011615 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" podStartSLOduration=3.011559812 podStartE2EDuration="3.011559812s" podCreationTimestamp="2026-03-19 20:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:27:18.009175436 +0000 UTC m=+1315.615128508" watchObservedRunningTime="2026-03-19 20:27:18.011559812 +0000 UTC m=+1315.617512914" Mar 19 20:27:18 crc kubenswrapper[4799]: I0319 20:27:18.989348 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b1c5d7a-7501-4c34-9823-c996a2413399","Type":"ContainerStarted","Data":"15d2c22e7a1e7c75c2fce6b608ee35a01d7491385b7cb01847414489c9613ee9"} Mar 19 20:27:18 crc kubenswrapper[4799]: I0319 20:27:18.989692 4799 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.005665 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.122526 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"] Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.122959 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerName="dnsmasq-dns" containerID="cri-o://f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2" gracePeriod=10 Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.360206 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79db78f56f-q2stc"] Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.361951 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.384468 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79db78f56f-q2stc"] Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.418696 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-openstack-edpm-ipam\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.418746 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-dns-svc\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.418767 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-ovsdbserver-nb\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.418832 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-config\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.418923 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-dns-swift-storage-0\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.418983 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf8nl\" (UniqueName: \"kubernetes.io/projected/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-kube-api-access-qf8nl\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.419050 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-ovsdbserver-sb\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521131 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-ovsdbserver-sb\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521221 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-openstack-edpm-ipam\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521245 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-dns-svc\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521267 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-ovsdbserver-nb\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521292 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-config\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521339 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-dns-swift-storage-0\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.521399 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf8nl\" (UniqueName: \"kubernetes.io/projected/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-kube-api-access-qf8nl\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.522114 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-dns-svc\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.522239 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-ovsdbserver-sb\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.522401 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-config\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.522463 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-dns-swift-storage-0\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.523751 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-openstack-edpm-ipam\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.524051 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-ovsdbserver-nb\") pod 
\"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.548380 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf8nl\" (UniqueName: \"kubernetes.io/projected/9ae6606b-efd1-4bf8-a13f-c14d96bbaa99-kube-api-access-qf8nl\") pod \"dnsmasq-dns-79db78f56f-q2stc\" (UID: \"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99\") " pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.620249 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.622371 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-svc\") pod \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.622767 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-sb\") pod \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.622830 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-config\") pod \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.622895 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdgcj\" (UniqueName: 
\"kubernetes.io/projected/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-kube-api-access-bdgcj\") pod \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.622963 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-nb\") pod \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.623021 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-swift-storage-0\") pod \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\" (UID: \"08ab47f7-ed2d-457a-9b03-31a54cd2f62e\") " Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.630452 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-kube-api-access-bdgcj" (OuterVolumeSpecName: "kube-api-access-bdgcj") pod "08ab47f7-ed2d-457a-9b03-31a54cd2f62e" (UID: "08ab47f7-ed2d-457a-9b03-31a54cd2f62e"). InnerVolumeSpecName "kube-api-access-bdgcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.679016 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.683138 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-config" (OuterVolumeSpecName: "config") pod "08ab47f7-ed2d-457a-9b03-31a54cd2f62e" (UID: "08ab47f7-ed2d-457a-9b03-31a54cd2f62e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.702880 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08ab47f7-ed2d-457a-9b03-31a54cd2f62e" (UID: "08ab47f7-ed2d-457a-9b03-31a54cd2f62e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.708870 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08ab47f7-ed2d-457a-9b03-31a54cd2f62e" (UID: "08ab47f7-ed2d-457a-9b03-31a54cd2f62e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.717817 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08ab47f7-ed2d-457a-9b03-31a54cd2f62e" (UID: "08ab47f7-ed2d-457a-9b03-31a54cd2f62e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.719382 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08ab47f7-ed2d-457a-9b03-31a54cd2f62e" (UID: "08ab47f7-ed2d-457a-9b03-31a54cd2f62e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.724012 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.724043 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.724053 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdgcj\" (UniqueName: \"kubernetes.io/projected/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-kube-api-access-bdgcj\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.724064 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.724073 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:26 crc kubenswrapper[4799]: I0319 20:27:26.724083 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ab47f7-ed2d-457a-9b03-31a54cd2f62e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.096922 4799 generic.go:334] "Generic (PLEG): container finished" podID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerID="f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2" exitCode=0 Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.097317 4799 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" event={"ID":"08ab47f7-ed2d-457a-9b03-31a54cd2f62e","Type":"ContainerDied","Data":"f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2"} Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.097360 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" event={"ID":"08ab47f7-ed2d-457a-9b03-31a54cd2f62e","Type":"ContainerDied","Data":"07c0e08bbdff528858aca040c950372f8217da76f222008fdc9be049cf9745dd"} Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.097428 4799 scope.go:117] "RemoveContainer" containerID="f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.097524 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-ccgcs" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.127161 4799 scope.go:117] "RemoveContainer" containerID="41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.156024 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"] Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.171578 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-ccgcs"] Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.172778 4799 scope.go:117] "RemoveContainer" containerID="f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2" Mar 19 20:27:27 crc kubenswrapper[4799]: E0319 20:27:27.173189 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2\": container with ID starting with f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2 not found: ID does not exist" 
containerID="f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.173228 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2"} err="failed to get container status \"f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2\": rpc error: code = NotFound desc = could not find container \"f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2\": container with ID starting with f1f8f5e0c81f8c5d752f40300055e2fa2472ebd9b39af6b0e8ce7c6b96386ef2 not found: ID does not exist" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.173255 4799 scope.go:117] "RemoveContainer" containerID="41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be" Mar 19 20:27:27 crc kubenswrapper[4799]: E0319 20:27:27.173662 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be\": container with ID starting with 41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be not found: ID does not exist" containerID="41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.173698 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be"} err="failed to get container status \"41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be\": rpc error: code = NotFound desc = could not find container \"41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be\": container with ID starting with 41494a4cb40a1c2c50549c9e7ea6d61809c47a17ceae91b931dd8ecd34cb60be not found: ID does not exist" Mar 19 20:27:27 crc kubenswrapper[4799]: I0319 20:27:27.182195 4799 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79db78f56f-q2stc"] Mar 19 20:27:27 crc kubenswrapper[4799]: W0319 20:27:27.182843 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae6606b_efd1_4bf8_a13f_c14d96bbaa99.slice/crio-077af4bfb3120a94851b37a7b0c235ce6941d39046f003f007dd4f474edc1697 WatchSource:0}: Error finding container 077af4bfb3120a94851b37a7b0c235ce6941d39046f003f007dd4f474edc1697: Status 404 returned error can't find the container with id 077af4bfb3120a94851b37a7b0c235ce6941d39046f003f007dd4f474edc1697 Mar 19 20:27:28 crc kubenswrapper[4799]: I0319 20:27:28.113989 4799 generic.go:334] "Generic (PLEG): container finished" podID="9ae6606b-efd1-4bf8-a13f-c14d96bbaa99" containerID="2e21f876864454fd16adbc1444cfe72d4d66ba098132eefb26caf42a7f072830" exitCode=0 Mar 19 20:27:28 crc kubenswrapper[4799]: I0319 20:27:28.114105 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" event={"ID":"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99","Type":"ContainerDied","Data":"2e21f876864454fd16adbc1444cfe72d4d66ba098132eefb26caf42a7f072830"} Mar 19 20:27:28 crc kubenswrapper[4799]: I0319 20:27:28.115800 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" event={"ID":"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99","Type":"ContainerStarted","Data":"077af4bfb3120a94851b37a7b0c235ce6941d39046f003f007dd4f474edc1697"} Mar 19 20:27:28 crc kubenswrapper[4799]: I0319 20:27:28.756451 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:27:28 crc kubenswrapper[4799]: I0319 20:27:28.756539 4799 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:27:29 crc kubenswrapper[4799]: I0319 20:27:29.137684 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" path="/var/lib/kubelet/pods/08ab47f7-ed2d-457a-9b03-31a54cd2f62e/volumes" Mar 19 20:27:29 crc kubenswrapper[4799]: I0319 20:27:29.139666 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" event={"ID":"9ae6606b-efd1-4bf8-a13f-c14d96bbaa99","Type":"ContainerStarted","Data":"921db1dde9e834e08bb940ce0735b13e12967c2a091d1396e9277ab71e5c5ac1"} Mar 19 20:27:29 crc kubenswrapper[4799]: I0319 20:27:29.139959 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:29 crc kubenswrapper[4799]: I0319 20:27:29.185787 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" podStartSLOduration=3.185756262 podStartE2EDuration="3.185756262s" podCreationTimestamp="2026-03-19 20:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:27:29.172918435 +0000 UTC m=+1326.778871547" watchObservedRunningTime="2026-03-19 20:27:29.185756262 +0000 UTC m=+1326.791709364" Mar 19 20:27:36 crc kubenswrapper[4799]: I0319 20:27:36.681587 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79db78f56f-q2stc" Mar 19 20:27:36 crc kubenswrapper[4799]: I0319 20:27:36.766466 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658d9cd857-b7h82"] Mar 19 20:27:36 crc kubenswrapper[4799]: I0319 20:27:36.766739 4799 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerName="dnsmasq-dns" containerID="cri-o://091c67f71c1e7ef5682647a89e6fa63babd29ed43d3973669e4568cb50bf4285" gracePeriod=10 Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.245064 4799 generic.go:334] "Generic (PLEG): container finished" podID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerID="091c67f71c1e7ef5682647a89e6fa63babd29ed43d3973669e4568cb50bf4285" exitCode=0 Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.245489 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" event={"ID":"3365676b-9535-4e50-93c5-2adb6f8ec4b9","Type":"ContainerDied","Data":"091c67f71c1e7ef5682647a89e6fa63babd29ed43d3973669e4568cb50bf4285"} Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.245559 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" event={"ID":"3365676b-9535-4e50-93c5-2adb6f8ec4b9","Type":"ContainerDied","Data":"d05a60a131fcfc84a0059605ba51b5d9d85720a2dbff7325a48af3d36b15f217"} Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.245581 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d05a60a131fcfc84a0059605ba51b5d9d85720a2dbff7325a48af3d36b15f217" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.263459 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.366979 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-nb\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.367031 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-svc\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.367077 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-config\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.367138 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-openstack-edpm-ipam\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.367166 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp7bx\" (UniqueName: \"kubernetes.io/projected/3365676b-9535-4e50-93c5-2adb6f8ec4b9-kube-api-access-dp7bx\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.367218 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-swift-storage-0\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.367299 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-sb\") pod \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\" (UID: \"3365676b-9535-4e50-93c5-2adb6f8ec4b9\") " Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.376128 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3365676b-9535-4e50-93c5-2adb6f8ec4b9-kube-api-access-dp7bx" (OuterVolumeSpecName: "kube-api-access-dp7bx") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "kube-api-access-dp7bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.424834 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-config" (OuterVolumeSpecName: "config") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.427461 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.429248 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.438572 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.439211 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.442592 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3365676b-9535-4e50-93c5-2adb6f8ec4b9" (UID: "3365676b-9535-4e50-93c5-2adb6f8ec4b9"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.472829 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.473468 4799 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.473503 4799 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.473517 4799 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.473530 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.473543 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp7bx\" (UniqueName: \"kubernetes.io/projected/3365676b-9535-4e50-93c5-2adb6f8ec4b9-kube-api-access-dp7bx\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:37 crc kubenswrapper[4799]: I0319 20:27:37.473558 4799 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3365676b-9535-4e50-93c5-2adb6f8ec4b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:27:38 crc kubenswrapper[4799]: I0319 20:27:38.257767 
4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658d9cd857-b7h82" Mar 19 20:27:38 crc kubenswrapper[4799]: I0319 20:27:38.320299 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658d9cd857-b7h82"] Mar 19 20:27:38 crc kubenswrapper[4799]: I0319 20:27:38.358238 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658d9cd857-b7h82"] Mar 19 20:27:39 crc kubenswrapper[4799]: I0319 20:27:39.125500 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" path="/var/lib/kubelet/pods/3365676b-9535-4e50-93c5-2adb6f8ec4b9/volumes" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.358975 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck"] Mar 19 20:27:45 crc kubenswrapper[4799]: E0319 20:27:45.359517 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerName="dnsmasq-dns" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.359533 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerName="dnsmasq-dns" Mar 19 20:27:45 crc kubenswrapper[4799]: E0319 20:27:45.359544 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerName="init" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.359551 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerName="init" Mar 19 20:27:45 crc kubenswrapper[4799]: E0319 20:27:45.359566 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerName="dnsmasq-dns" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.359573 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerName="dnsmasq-dns" Mar 19 20:27:45 crc kubenswrapper[4799]: E0319 20:27:45.359587 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerName="init" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.359593 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerName="init" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.359769 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="3365676b-9535-4e50-93c5-2adb6f8ec4b9" containerName="dnsmasq-dns" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.359778 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ab47f7-ed2d-457a-9b03-31a54cd2f62e" containerName="dnsmasq-dns" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.360453 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.365945 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.366011 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.366176 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.379458 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck"] Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.386774 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:27:45 crc 
kubenswrapper[4799]: I0319 20:27:45.453357 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.453432 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.453499 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltf8\" (UniqueName: \"kubernetes.io/projected/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-kube-api-access-cltf8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.453543 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.555681 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.555796 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.555834 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.555896 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cltf8\" (UniqueName: \"kubernetes.io/projected/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-kube-api-access-cltf8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.567063 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: 
\"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.580461 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.581000 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.584009 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltf8\" (UniqueName: \"kubernetes.io/projected/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-kube-api-access-cltf8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:45 crc kubenswrapper[4799]: I0319 20:27:45.692376 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:27:46 crc kubenswrapper[4799]: W0319 20:27:46.293569 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e6ab146_318d_4a9d_860b_8c1d5c12fab9.slice/crio-f9aab61e78270b150045fd814486e0ce098a048e69dc3fa52d8e852b02a3e6d1 WatchSource:0}: Error finding container f9aab61e78270b150045fd814486e0ce098a048e69dc3fa52d8e852b02a3e6d1: Status 404 returned error can't find the container with id f9aab61e78270b150045fd814486e0ce098a048e69dc3fa52d8e852b02a3e6d1 Mar 19 20:27:46 crc kubenswrapper[4799]: I0319 20:27:46.294777 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck"] Mar 19 20:27:46 crc kubenswrapper[4799]: I0319 20:27:46.346861 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" event={"ID":"6e6ab146-318d-4a9d-860b-8c1d5c12fab9","Type":"ContainerStarted","Data":"f9aab61e78270b150045fd814486e0ce098a048e69dc3fa52d8e852b02a3e6d1"} Mar 19 20:27:49 crc kubenswrapper[4799]: I0319 20:27:49.380271 4799 generic.go:334] "Generic (PLEG): container finished" podID="3117828b-97c2-41b6-a48d-cf7154e2bb71" containerID="de7203a57bf668d5ffed43ffd811481d1d9f2d92ba84af317e6a21af85a2af6b" exitCode=0 Mar 19 20:27:49 crc kubenswrapper[4799]: I0319 20:27:49.380364 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3117828b-97c2-41b6-a48d-cf7154e2bb71","Type":"ContainerDied","Data":"de7203a57bf668d5ffed43ffd811481d1d9f2d92ba84af317e6a21af85a2af6b"} Mar 19 20:27:50 crc kubenswrapper[4799]: I0319 20:27:50.390971 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"3117828b-97c2-41b6-a48d-cf7154e2bb71","Type":"ContainerStarted","Data":"ff98642c6a0f6130fbebd49dc860723094d7ae295687dca702ece896d94cd926"} Mar 19 20:27:50 crc kubenswrapper[4799]: I0319 20:27:50.391482 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 20:27:50 crc kubenswrapper[4799]: I0319 20:27:50.429229 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.429213017 podStartE2EDuration="37.429213017s" podCreationTimestamp="2026-03-19 20:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:27:50.420367621 +0000 UTC m=+1348.026320713" watchObservedRunningTime="2026-03-19 20:27:50.429213017 +0000 UTC m=+1348.035166089" Mar 19 20:27:51 crc kubenswrapper[4799]: I0319 20:27:51.415471 4799 generic.go:334] "Generic (PLEG): container finished" podID="1b1c5d7a-7501-4c34-9823-c996a2413399" containerID="15d2c22e7a1e7c75c2fce6b608ee35a01d7491385b7cb01847414489c9613ee9" exitCode=0 Mar 19 20:27:51 crc kubenswrapper[4799]: I0319 20:27:51.415683 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b1c5d7a-7501-4c34-9823-c996a2413399","Type":"ContainerDied","Data":"15d2c22e7a1e7c75c2fce6b608ee35a01d7491385b7cb01847414489c9613ee9"} Mar 19 20:27:56 crc kubenswrapper[4799]: I0319 20:27:56.464474 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" event={"ID":"6e6ab146-318d-4a9d-860b-8c1d5c12fab9","Type":"ContainerStarted","Data":"26bf0e5f4788f17a5e0a9a75f9fcd26a941d3236c3f778964694f0a840378783"} Mar 19 20:27:56 crc kubenswrapper[4799]: I0319 20:27:56.468475 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"1b1c5d7a-7501-4c34-9823-c996a2413399","Type":"ContainerStarted","Data":"4a8cb7600135361508c05cc930f02856b16bb37c2cb396946b8c8e29ec4b73a2"} Mar 19 20:27:56 crc kubenswrapper[4799]: I0319 20:27:56.468862 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:27:56 crc kubenswrapper[4799]: I0319 20:27:56.489717 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" podStartSLOduration=1.7196000919999999 podStartE2EDuration="11.489690027s" podCreationTimestamp="2026-03-19 20:27:45 +0000 UTC" firstStartedPulling="2026-03-19 20:27:46.296734508 +0000 UTC m=+1343.902687590" lastFinishedPulling="2026-03-19 20:27:56.066824453 +0000 UTC m=+1353.672777525" observedRunningTime="2026-03-19 20:27:56.484265876 +0000 UTC m=+1354.090218978" watchObservedRunningTime="2026-03-19 20:27:56.489690027 +0000 UTC m=+1354.095643129" Mar 19 20:27:56 crc kubenswrapper[4799]: I0319 20:27:56.536612 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.536587803 podStartE2EDuration="41.536587803s" podCreationTimestamp="2026-03-19 20:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:27:56.529894076 +0000 UTC m=+1354.135847158" watchObservedRunningTime="2026-03-19 20:27:56.536587803 +0000 UTC m=+1354.142540915" Mar 19 20:27:58 crc kubenswrapper[4799]: I0319 20:27:58.755742 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:27:58 crc kubenswrapper[4799]: I0319 20:27:58.756044 4799 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:27:58 crc kubenswrapper[4799]: I0319 20:27:58.756084 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:27:58 crc kubenswrapper[4799]: I0319 20:27:58.756785 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"896459b668fd89c49f3549689a461713efd29689b0eb95fb693ea8252573a827"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:27:58 crc kubenswrapper[4799]: I0319 20:27:58.756837 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://896459b668fd89c49f3549689a461713efd29689b0eb95fb693ea8252573a827" gracePeriod=600 Mar 19 20:27:59 crc kubenswrapper[4799]: I0319 20:27:59.508653 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="896459b668fd89c49f3549689a461713efd29689b0eb95fb693ea8252573a827" exitCode=0 Mar 19 20:27:59 crc kubenswrapper[4799]: I0319 20:27:59.509189 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"896459b668fd89c49f3549689a461713efd29689b0eb95fb693ea8252573a827"} Mar 19 20:27:59 crc kubenswrapper[4799]: I0319 20:27:59.509838 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"09af92723404390c6609268ff888c50debb267c3f634bab822b68cda538b8c1c"} Mar 19 20:27:59 crc kubenswrapper[4799]: I0319 20:27:59.510016 4799 scope.go:117] "RemoveContainer" containerID="ebf1a7c33ca2e5a253f33af42655dacb79a5142f188b73cc17c9ab070ccc29c9" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.137769 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565868-bmldh"] Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.139278 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.140976 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.141403 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.142731 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.152651 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-bmldh"] Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.206662 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4mnf\" (UniqueName: \"kubernetes.io/projected/02f4e9ac-a328-489e-976e-79fc3642d88f-kube-api-access-q4mnf\") pod \"auto-csr-approver-29565868-bmldh\" (UID: \"02f4e9ac-a328-489e-976e-79fc3642d88f\") " pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:00 crc kubenswrapper[4799]: 
I0319 20:28:00.309036 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4mnf\" (UniqueName: \"kubernetes.io/projected/02f4e9ac-a328-489e-976e-79fc3642d88f-kube-api-access-q4mnf\") pod \"auto-csr-approver-29565868-bmldh\" (UID: \"02f4e9ac-a328-489e-976e-79fc3642d88f\") " pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.334117 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4mnf\" (UniqueName: \"kubernetes.io/projected/02f4e9ac-a328-489e-976e-79fc3642d88f-kube-api-access-q4mnf\") pod \"auto-csr-approver-29565868-bmldh\" (UID: \"02f4e9ac-a328-489e-976e-79fc3642d88f\") " pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:00 crc kubenswrapper[4799]: I0319 20:28:00.465476 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:01 crc kubenswrapper[4799]: I0319 20:28:01.046746 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-bmldh"] Mar 19 20:28:01 crc kubenswrapper[4799]: W0319 20:28:01.054267 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02f4e9ac_a328_489e_976e_79fc3642d88f.slice/crio-23228c6672f48cffd1bc0de9add6e2f2610287eb2f460039d9a162ce427ab468 WatchSource:0}: Error finding container 23228c6672f48cffd1bc0de9add6e2f2610287eb2f460039d9a162ce427ab468: Status 404 returned error can't find the container with id 23228c6672f48cffd1bc0de9add6e2f2610287eb2f460039d9a162ce427ab468 Mar 19 20:28:01 crc kubenswrapper[4799]: I0319 20:28:01.541260 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-bmldh" 
event={"ID":"02f4e9ac-a328-489e-976e-79fc3642d88f","Type":"ContainerStarted","Data":"23228c6672f48cffd1bc0de9add6e2f2610287eb2f460039d9a162ce427ab468"} Mar 19 20:28:02 crc kubenswrapper[4799]: I0319 20:28:02.556788 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-bmldh" event={"ID":"02f4e9ac-a328-489e-976e-79fc3642d88f","Type":"ContainerStarted","Data":"95bb59016b433dab2f7105e6e7671f83f86e600bec3c153f73944b9e9e6878f3"} Mar 19 20:28:02 crc kubenswrapper[4799]: I0319 20:28:02.594121 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565868-bmldh" podStartSLOduration=1.737429635 podStartE2EDuration="2.594098373s" podCreationTimestamp="2026-03-19 20:28:00 +0000 UTC" firstStartedPulling="2026-03-19 20:28:01.05777263 +0000 UTC m=+1358.663725702" lastFinishedPulling="2026-03-19 20:28:01.914441318 +0000 UTC m=+1359.520394440" observedRunningTime="2026-03-19 20:28:02.585540866 +0000 UTC m=+1360.191493928" watchObservedRunningTime="2026-03-19 20:28:02.594098373 +0000 UTC m=+1360.200051465" Mar 19 20:28:03 crc kubenswrapper[4799]: I0319 20:28:03.572840 4799 generic.go:334] "Generic (PLEG): container finished" podID="02f4e9ac-a328-489e-976e-79fc3642d88f" containerID="95bb59016b433dab2f7105e6e7671f83f86e600bec3c153f73944b9e9e6878f3" exitCode=0 Mar 19 20:28:03 crc kubenswrapper[4799]: I0319 20:28:03.573375 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-bmldh" event={"ID":"02f4e9ac-a328-489e-976e-79fc3642d88f","Type":"ContainerDied","Data":"95bb59016b433dab2f7105e6e7671f83f86e600bec3c153f73944b9e9e6878f3"} Mar 19 20:28:04 crc kubenswrapper[4799]: I0319 20:28:04.339700 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 20:28:04 crc kubenswrapper[4799]: I0319 20:28:04.934765 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.006396 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4mnf\" (UniqueName: \"kubernetes.io/projected/02f4e9ac-a328-489e-976e-79fc3642d88f-kube-api-access-q4mnf\") pod \"02f4e9ac-a328-489e-976e-79fc3642d88f\" (UID: \"02f4e9ac-a328-489e-976e-79fc3642d88f\") " Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.023692 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02f4e9ac-a328-489e-976e-79fc3642d88f-kube-api-access-q4mnf" (OuterVolumeSpecName: "kube-api-access-q4mnf") pod "02f4e9ac-a328-489e-976e-79fc3642d88f" (UID: "02f4e9ac-a328-489e-976e-79fc3642d88f"). InnerVolumeSpecName "kube-api-access-q4mnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.108692 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4mnf\" (UniqueName: \"kubernetes.io/projected/02f4e9ac-a328-489e-976e-79fc3642d88f-kube-api-access-q4mnf\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.598028 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-bmldh" event={"ID":"02f4e9ac-a328-489e-976e-79fc3642d88f","Type":"ContainerDied","Data":"23228c6672f48cffd1bc0de9add6e2f2610287eb2f460039d9a162ce427ab468"} Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.598078 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23228c6672f48cffd1bc0de9add6e2f2610287eb2f460039d9a162ce427ab468" Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.598107 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-bmldh" Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.674562 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-hvpbt"] Mar 19 20:28:05 crc kubenswrapper[4799]: I0319 20:28:05.687859 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-hvpbt"] Mar 19 20:28:06 crc kubenswrapper[4799]: I0319 20:28:06.609843 4799 generic.go:334] "Generic (PLEG): container finished" podID="6e6ab146-318d-4a9d-860b-8c1d5c12fab9" containerID="26bf0e5f4788f17a5e0a9a75f9fcd26a941d3236c3f778964694f0a840378783" exitCode=0 Mar 19 20:28:06 crc kubenswrapper[4799]: I0319 20:28:06.609890 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" event={"ID":"6e6ab146-318d-4a9d-860b-8c1d5c12fab9","Type":"ContainerDied","Data":"26bf0e5f4788f17a5e0a9a75f9fcd26a941d3236c3f778964694f0a840378783"} Mar 19 20:28:07 crc kubenswrapper[4799]: I0319 20:28:07.136213 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0e3ea0-f539-4049-ab71-127f44c0d997" path="/var/lib/kubelet/pods/ed0e3ea0-f539-4049-ab71-127f44c0d997/volumes" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.099973 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.188269 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cltf8\" (UniqueName: \"kubernetes.io/projected/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-kube-api-access-cltf8\") pod \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.188352 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-repo-setup-combined-ca-bundle\") pod \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.188420 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-inventory\") pod \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.188454 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-ssh-key-openstack-edpm-ipam\") pod \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\" (UID: \"6e6ab146-318d-4a9d-860b-8c1d5c12fab9\") " Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.195048 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-kube-api-access-cltf8" (OuterVolumeSpecName: "kube-api-access-cltf8") pod "6e6ab146-318d-4a9d-860b-8c1d5c12fab9" (UID: "6e6ab146-318d-4a9d-860b-8c1d5c12fab9"). InnerVolumeSpecName "kube-api-access-cltf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.197620 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6e6ab146-318d-4a9d-860b-8c1d5c12fab9" (UID: "6e6ab146-318d-4a9d-860b-8c1d5c12fab9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.221562 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6e6ab146-318d-4a9d-860b-8c1d5c12fab9" (UID: "6e6ab146-318d-4a9d-860b-8c1d5c12fab9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.222034 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-inventory" (OuterVolumeSpecName: "inventory") pod "6e6ab146-318d-4a9d-860b-8c1d5c12fab9" (UID: "6e6ab146-318d-4a9d-860b-8c1d5c12fab9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.291030 4799 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.291097 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.291112 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.291125 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cltf8\" (UniqueName: \"kubernetes.io/projected/6e6ab146-318d-4a9d-860b-8c1d5c12fab9-kube-api-access-cltf8\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.633700 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" event={"ID":"6e6ab146-318d-4a9d-860b-8c1d5c12fab9","Type":"ContainerDied","Data":"f9aab61e78270b150045fd814486e0ce098a048e69dc3fa52d8e852b02a3e6d1"} Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.634023 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9aab61e78270b150045fd814486e0ce098a048e69dc3fa52d8e852b02a3e6d1" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.633798 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.724797 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f"] Mar 19 20:28:08 crc kubenswrapper[4799]: E0319 20:28:08.725162 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02f4e9ac-a328-489e-976e-79fc3642d88f" containerName="oc" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.725173 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f4e9ac-a328-489e-976e-79fc3642d88f" containerName="oc" Mar 19 20:28:08 crc kubenswrapper[4799]: E0319 20:28:08.725209 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6ab146-318d-4a9d-860b-8c1d5c12fab9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.725216 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6ab146-318d-4a9d-860b-8c1d5c12fab9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.725454 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6ab146-318d-4a9d-860b-8c1d5c12fab9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.725470 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="02f4e9ac-a328-489e-976e-79fc3642d88f" containerName="oc" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.726145 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.727861 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.728320 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.728320 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.728461 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.753708 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f"] Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.805408 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxc2p\" (UniqueName: \"kubernetes.io/projected/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-kube-api-access-dxc2p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.805512 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.805543 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.907492 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxc2p\" (UniqueName: \"kubernetes.io/projected/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-kube-api-access-dxc2p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.907599 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.907627 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.914257 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.929856 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:08 crc kubenswrapper[4799]: I0319 20:28:08.935901 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxc2p\" (UniqueName: \"kubernetes.io/projected/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-kube-api-access-dxc2p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ngr7f\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:09 crc kubenswrapper[4799]: I0319 20:28:09.048411 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:09 crc kubenswrapper[4799]: W0319 20:28:09.615607 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27e3e050_d2b1_4bc2_b93c_4258f8f4a86d.slice/crio-fd3ccc132fa3a58320edd8ac1a1e1a5dbed973629f477ff7bfd69bbd00f79c50 WatchSource:0}: Error finding container fd3ccc132fa3a58320edd8ac1a1e1a5dbed973629f477ff7bfd69bbd00f79c50: Status 404 returned error can't find the container with id fd3ccc132fa3a58320edd8ac1a1e1a5dbed973629f477ff7bfd69bbd00f79c50 Mar 19 20:28:09 crc kubenswrapper[4799]: I0319 20:28:09.623190 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f"] Mar 19 20:28:09 crc kubenswrapper[4799]: I0319 20:28:09.644863 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" event={"ID":"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d","Type":"ContainerStarted","Data":"fd3ccc132fa3a58320edd8ac1a1e1a5dbed973629f477ff7bfd69bbd00f79c50"} Mar 19 20:28:10 crc kubenswrapper[4799]: I0319 20:28:10.657774 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" event={"ID":"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d","Type":"ContainerStarted","Data":"6ebd7b3212800eaa975545781130ce6da8bff2eb08019568680d113b141e2159"} Mar 19 20:28:10 crc kubenswrapper[4799]: I0319 20:28:10.685190 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" podStartSLOduration=2.125023821 podStartE2EDuration="2.685161875s" podCreationTimestamp="2026-03-19 20:28:08 +0000 UTC" firstStartedPulling="2026-03-19 20:28:09.619147539 +0000 UTC m=+1367.225100621" lastFinishedPulling="2026-03-19 20:28:10.179285563 +0000 UTC m=+1367.785238675" observedRunningTime="2026-03-19 
20:28:10.676225357 +0000 UTC m=+1368.282178439" watchObservedRunningTime="2026-03-19 20:28:10.685161875 +0000 UTC m=+1368.291114987" Mar 19 20:28:13 crc kubenswrapper[4799]: I0319 20:28:13.701942 4799 generic.go:334] "Generic (PLEG): container finished" podID="27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" containerID="6ebd7b3212800eaa975545781130ce6da8bff2eb08019568680d113b141e2159" exitCode=0 Mar 19 20:28:13 crc kubenswrapper[4799]: I0319 20:28:13.702063 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" event={"ID":"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d","Type":"ContainerDied","Data":"6ebd7b3212800eaa975545781130ce6da8bff2eb08019568680d113b141e2159"} Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.228553 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.349826 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxc2p\" (UniqueName: \"kubernetes.io/projected/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-kube-api-access-dxc2p\") pod \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.349873 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-ssh-key-openstack-edpm-ipam\") pod \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\" (UID: \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.349976 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-inventory\") pod \"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\" (UID: 
\"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d\") " Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.357612 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-kube-api-access-dxc2p" (OuterVolumeSpecName: "kube-api-access-dxc2p") pod "27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" (UID: "27e3e050-d2b1-4bc2-b93c-4258f8f4a86d"). InnerVolumeSpecName "kube-api-access-dxc2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.407120 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" (UID: "27e3e050-d2b1-4bc2-b93c-4258f8f4a86d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.410264 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-inventory" (OuterVolumeSpecName: "inventory") pod "27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" (UID: "27e3e050-d2b1-4bc2-b93c-4258f8f4a86d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.452579 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxc2p\" (UniqueName: \"kubernetes.io/projected/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-kube-api-access-dxc2p\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.452635 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.452654 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e3e050-d2b1-4bc2-b93c-4258f8f4a86d-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.730436 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" event={"ID":"27e3e050-d2b1-4bc2-b93c-4258f8f4a86d","Type":"ContainerDied","Data":"fd3ccc132fa3a58320edd8ac1a1e1a5dbed973629f477ff7bfd69bbd00f79c50"} Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.730498 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd3ccc132fa3a58320edd8ac1a1e1a5dbed973629f477ff7bfd69bbd00f79c50" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.731031 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ngr7f" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.823885 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9"] Mar 19 20:28:15 crc kubenswrapper[4799]: E0319 20:28:15.824551 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.824606 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.824997 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e3e050-d2b1-4bc2-b93c-4258f8f4a86d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.826043 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.831329 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.833024 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.833568 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.834041 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.851372 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9"] Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.926873 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.964560 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.964821 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: 
\"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.964866 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtk25\" (UniqueName: \"kubernetes.io/projected/8befdbab-e306-4827-98a7-f042c02380ae-kube-api-access-mtk25\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:15 crc kubenswrapper[4799]: I0319 20:28:15.964999 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.066880 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.066931 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtk25\" (UniqueName: \"kubernetes.io/projected/8befdbab-e306-4827-98a7-f042c02380ae-kube-api-access-mtk25\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.067058 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.067166 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.070882 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.073763 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.080101 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.091769 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtk25\" (UniqueName: \"kubernetes.io/projected/8befdbab-e306-4827-98a7-f042c02380ae-kube-api-access-mtk25\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.155781 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.686743 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9"] Mar 19 20:28:16 crc kubenswrapper[4799]: I0319 20:28:16.742531 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" event={"ID":"8befdbab-e306-4827-98a7-f042c02380ae","Type":"ContainerStarted","Data":"bce8e702116f0cb6cf87bbb8f6292ddc547a5e23dfb6cace7eda97079aaec414"} Mar 19 20:28:17 crc kubenswrapper[4799]: I0319 20:28:17.760538 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" event={"ID":"8befdbab-e306-4827-98a7-f042c02380ae","Type":"ContainerStarted","Data":"eb34cf8298f345ce8fb98e37c591cbd9984d9335d0c1ecec54e7119d7b5e63a1"} Mar 19 20:28:17 crc kubenswrapper[4799]: I0319 20:28:17.785181 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" podStartSLOduration=2.377706444 podStartE2EDuration="2.785160738s" podCreationTimestamp="2026-03-19 20:28:15 
+0000 UTC" firstStartedPulling="2026-03-19 20:28:16.696420143 +0000 UTC m=+1374.302373225" lastFinishedPulling="2026-03-19 20:28:17.103874457 +0000 UTC m=+1374.709827519" observedRunningTime="2026-03-19 20:28:17.782695799 +0000 UTC m=+1375.388648931" watchObservedRunningTime="2026-03-19 20:28:17.785160738 +0000 UTC m=+1375.391113820" Mar 19 20:28:56 crc kubenswrapper[4799]: I0319 20:28:56.438111 4799 scope.go:117] "RemoveContainer" containerID="7d2a71c97bbe34456f4119637ade92cc9641790c4f95dc448432908cbc0bf4b0" Mar 19 20:28:56 crc kubenswrapper[4799]: I0319 20:28:56.476319 4799 scope.go:117] "RemoveContainer" containerID="9578059facd4d2e677a536d42c8c66afd21b7edf18ff670903facc65d9a754a8" Mar 19 20:28:56 crc kubenswrapper[4799]: I0319 20:28:56.563692 4799 scope.go:117] "RemoveContainer" containerID="5676b690020210ecb247cd185ae33755305c7ceb5fc992bfa9b28ea2f8c95f5a" Mar 19 20:28:56 crc kubenswrapper[4799]: I0319 20:28:56.596480 4799 scope.go:117] "RemoveContainer" containerID="ef0e2f99cd80bd9cf55165de8414ab0f5e5d043313cad0c04973c3aa57a69653" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.756061 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jfmtv"] Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.760327 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.766927 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfmtv"] Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.797621 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-utilities\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.797802 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2klcm\" (UniqueName: \"kubernetes.io/projected/93a68903-0734-4c9b-a1db-048404aa625d-kube-api-access-2klcm\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.797877 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-catalog-content\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.899518 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-utilities\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.899611 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2klcm\" (UniqueName: \"kubernetes.io/projected/93a68903-0734-4c9b-a1db-048404aa625d-kube-api-access-2klcm\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.899660 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-catalog-content\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.900204 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-catalog-content\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.900572 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-utilities\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:09 crc kubenswrapper[4799]: I0319 20:29:09.924370 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2klcm\" (UniqueName: \"kubernetes.io/projected/93a68903-0734-4c9b-a1db-048404aa625d-kube-api-access-2klcm\") pod \"redhat-operators-jfmtv\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:10 crc kubenswrapper[4799]: I0319 20:29:10.103017 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:10 crc kubenswrapper[4799]: I0319 20:29:10.605523 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfmtv"] Mar 19 20:29:11 crc kubenswrapper[4799]: I0319 20:29:11.482212 4799 generic.go:334] "Generic (PLEG): container finished" podID="93a68903-0734-4c9b-a1db-048404aa625d" containerID="11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c" exitCode=0 Mar 19 20:29:11 crc kubenswrapper[4799]: I0319 20:29:11.482297 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerDied","Data":"11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c"} Mar 19 20:29:11 crc kubenswrapper[4799]: I0319 20:29:11.482622 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerStarted","Data":"925a7915ad8416edaa4634caafa1ab13504eafb7f61092ea0011957bcba1e785"} Mar 19 20:29:13 crc kubenswrapper[4799]: I0319 20:29:13.510346 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerStarted","Data":"a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e"} Mar 19 20:29:15 crc kubenswrapper[4799]: I0319 20:29:15.538631 4799 generic.go:334] "Generic (PLEG): container finished" podID="93a68903-0734-4c9b-a1db-048404aa625d" containerID="a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e" exitCode=0 Mar 19 20:29:15 crc kubenswrapper[4799]: I0319 20:29:15.539119 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" 
event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerDied","Data":"a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e"} Mar 19 20:29:16 crc kubenswrapper[4799]: I0319 20:29:16.555068 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerStarted","Data":"586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5"} Mar 19 20:29:16 crc kubenswrapper[4799]: I0319 20:29:16.588001 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jfmtv" podStartSLOduration=3.006305367 podStartE2EDuration="7.587970208s" podCreationTimestamp="2026-03-19 20:29:09 +0000 UTC" firstStartedPulling="2026-03-19 20:29:11.486557391 +0000 UTC m=+1429.092510493" lastFinishedPulling="2026-03-19 20:29:16.068222232 +0000 UTC m=+1433.674175334" observedRunningTime="2026-03-19 20:29:16.577686773 +0000 UTC m=+1434.183639875" watchObservedRunningTime="2026-03-19 20:29:16.587970208 +0000 UTC m=+1434.193923310" Mar 19 20:29:20 crc kubenswrapper[4799]: I0319 20:29:20.103625 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:20 crc kubenswrapper[4799]: I0319 20:29:20.104345 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:21 crc kubenswrapper[4799]: I0319 20:29:21.162279 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jfmtv" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="registry-server" probeResult="failure" output=< Mar 19 20:29:21 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:29:21 crc kubenswrapper[4799]: > Mar 19 20:29:30 crc kubenswrapper[4799]: I0319 20:29:30.178744 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:30 crc kubenswrapper[4799]: I0319 20:29:30.250582 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:30 crc kubenswrapper[4799]: I0319 20:29:30.431054 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfmtv"] Mar 19 20:29:31 crc kubenswrapper[4799]: I0319 20:29:31.765539 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jfmtv" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="registry-server" containerID="cri-o://586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5" gracePeriod=2 Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.242984 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.305184 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-utilities\") pod \"93a68903-0734-4c9b-a1db-048404aa625d\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.305588 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2klcm\" (UniqueName: \"kubernetes.io/projected/93a68903-0734-4c9b-a1db-048404aa625d-kube-api-access-2klcm\") pod \"93a68903-0734-4c9b-a1db-048404aa625d\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.305807 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-catalog-content\") pod 
\"93a68903-0734-4c9b-a1db-048404aa625d\" (UID: \"93a68903-0734-4c9b-a1db-048404aa625d\") " Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.308279 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-utilities" (OuterVolumeSpecName: "utilities") pod "93a68903-0734-4c9b-a1db-048404aa625d" (UID: "93a68903-0734-4c9b-a1db-048404aa625d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.313295 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a68903-0734-4c9b-a1db-048404aa625d-kube-api-access-2klcm" (OuterVolumeSpecName: "kube-api-access-2klcm") pod "93a68903-0734-4c9b-a1db-048404aa625d" (UID: "93a68903-0734-4c9b-a1db-048404aa625d"). InnerVolumeSpecName "kube-api-access-2klcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.408789 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.408845 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2klcm\" (UniqueName: \"kubernetes.io/projected/93a68903-0734-4c9b-a1db-048404aa625d-kube-api-access-2klcm\") on node \"crc\" DevicePath \"\"" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.473851 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93a68903-0734-4c9b-a1db-048404aa625d" (UID: "93a68903-0734-4c9b-a1db-048404aa625d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.510887 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a68903-0734-4c9b-a1db-048404aa625d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.780906 4799 generic.go:334] "Generic (PLEG): container finished" podID="93a68903-0734-4c9b-a1db-048404aa625d" containerID="586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5" exitCode=0 Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.780975 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerDied","Data":"586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5"} Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.780989 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfmtv" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.781030 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfmtv" event={"ID":"93a68903-0734-4c9b-a1db-048404aa625d","Type":"ContainerDied","Data":"925a7915ad8416edaa4634caafa1ab13504eafb7f61092ea0011957bcba1e785"} Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.781083 4799 scope.go:117] "RemoveContainer" containerID="586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.830675 4799 scope.go:117] "RemoveContainer" containerID="a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.843686 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfmtv"] Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.858535 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jfmtv"] Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.868730 4799 scope.go:117] "RemoveContainer" containerID="11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.943631 4799 scope.go:117] "RemoveContainer" containerID="586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5" Mar 19 20:29:32 crc kubenswrapper[4799]: E0319 20:29:32.944215 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5\": container with ID starting with 586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5 not found: ID does not exist" containerID="586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.944268 4799 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5"} err="failed to get container status \"586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5\": rpc error: code = NotFound desc = could not find container \"586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5\": container with ID starting with 586c2ee5f67449f6d4829425361671b29224efcb7bee937b848795042e78c2b5 not found: ID does not exist" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.944293 4799 scope.go:117] "RemoveContainer" containerID="a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e" Mar 19 20:29:32 crc kubenswrapper[4799]: E0319 20:29:32.944694 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e\": container with ID starting with a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e not found: ID does not exist" containerID="a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.944733 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e"} err="failed to get container status \"a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e\": rpc error: code = NotFound desc = could not find container \"a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e\": container with ID starting with a6fbcb7d8cc6da4034e4b7d12eeab954b73856dc22d550958e2f9c0dd31f055e not found: ID does not exist" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.944756 4799 scope.go:117] "RemoveContainer" containerID="11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c" Mar 19 20:29:32 crc kubenswrapper[4799]: E0319 
20:29:32.945137 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c\": container with ID starting with 11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c not found: ID does not exist" containerID="11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c" Mar 19 20:29:32 crc kubenswrapper[4799]: I0319 20:29:32.945175 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c"} err="failed to get container status \"11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c\": rpc error: code = NotFound desc = could not find container \"11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c\": container with ID starting with 11e7fb1cfdc11fc213a1f24280187ced10d0a552dedbcff7afe07478b104b83c not found: ID does not exist" Mar 19 20:29:33 crc kubenswrapper[4799]: I0319 20:29:33.136935 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a68903-0734-4c9b-a1db-048404aa625d" path="/var/lib/kubelet/pods/93a68903-0734-4c9b-a1db-048404aa625d/volumes" Mar 19 20:29:56 crc kubenswrapper[4799]: I0319 20:29:56.740096 4799 scope.go:117] "RemoveContainer" containerID="46e9a2e2b408783696a5ac06152210d1cd5ac8edd8f6a8b42c724202356e5a2f" Mar 19 20:29:56 crc kubenswrapper[4799]: I0319 20:29:56.800272 4799 scope.go:117] "RemoveContainer" containerID="3e62a82de3b89bd879aa78f98e9b11ed16982ea312eb10fa4da0c5e2af39e3bb" Mar 19 20:29:56 crc kubenswrapper[4799]: I0319 20:29:56.850057 4799 scope.go:117] "RemoveContainer" containerID="6c7e1c1307ea4516951caf7219f73245eefc35a6e86a8e373445f4d7d411f60d" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.162780 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565870-p96cm"] Mar 19 20:30:00 crc 
kubenswrapper[4799]: E0319 20:30:00.164412 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="registry-server" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.164443 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="registry-server" Mar 19 20:30:00 crc kubenswrapper[4799]: E0319 20:30:00.164505 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="extract-utilities" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.164519 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="extract-utilities" Mar 19 20:30:00 crc kubenswrapper[4799]: E0319 20:30:00.164556 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="extract-content" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.164572 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="extract-content" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.164992 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a68903-0734-4c9b-a1db-048404aa625d" containerName="registry-server" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.166196 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.170006 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.170290 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.171611 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.176436 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2"] Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.178617 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.181480 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.182304 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.186045 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-p96cm"] Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.219225 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2"] Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.236095 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/67210496-c7b2-4ab5-aae0-23e19f947c67-secret-volume\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.236584 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrw7\" (UniqueName: \"kubernetes.io/projected/2b3f2053-f94c-40f3-876d-0e3b8ca33c77-kube-api-access-knrw7\") pod \"auto-csr-approver-29565870-p96cm\" (UID: \"2b3f2053-f94c-40f3-876d-0e3b8ca33c77\") " pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.236694 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67210496-c7b2-4ab5-aae0-23e19f947c67-config-volume\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.236927 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7rl\" (UniqueName: \"kubernetes.io/projected/67210496-c7b2-4ab5-aae0-23e19f947c67-kube-api-access-8q7rl\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.339786 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67210496-c7b2-4ab5-aae0-23e19f947c67-secret-volume\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 
20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.339998 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrw7\" (UniqueName: \"kubernetes.io/projected/2b3f2053-f94c-40f3-876d-0e3b8ca33c77-kube-api-access-knrw7\") pod \"auto-csr-approver-29565870-p96cm\" (UID: \"2b3f2053-f94c-40f3-876d-0e3b8ca33c77\") " pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.340206 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67210496-c7b2-4ab5-aae0-23e19f947c67-config-volume\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.340329 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q7rl\" (UniqueName: \"kubernetes.io/projected/67210496-c7b2-4ab5-aae0-23e19f947c67-kube-api-access-8q7rl\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.342432 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67210496-c7b2-4ab5-aae0-23e19f947c67-config-volume\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.348711 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67210496-c7b2-4ab5-aae0-23e19f947c67-secret-volume\") pod \"collect-profiles-29565870-zdhj2\" (UID: 
\"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.359568 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrw7\" (UniqueName: \"kubernetes.io/projected/2b3f2053-f94c-40f3-876d-0e3b8ca33c77-kube-api-access-knrw7\") pod \"auto-csr-approver-29565870-p96cm\" (UID: \"2b3f2053-f94c-40f3-876d-0e3b8ca33c77\") " pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.373350 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7rl\" (UniqueName: \"kubernetes.io/projected/67210496-c7b2-4ab5-aae0-23e19f947c67-kube-api-access-8q7rl\") pod \"collect-profiles-29565870-zdhj2\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.494130 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:00 crc kubenswrapper[4799]: I0319 20:30:00.504165 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:01 crc kubenswrapper[4799]: I0319 20:30:01.034589 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-p96cm"] Mar 19 20:30:01 crc kubenswrapper[4799]: I0319 20:30:01.041605 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:30:01 crc kubenswrapper[4799]: W0319 20:30:01.115709 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67210496_c7b2_4ab5_aae0_23e19f947c67.slice/crio-382d9420b96dd6262a050ec35ad2a568a1fa79f877c11585086236ee918e80b7 WatchSource:0}: Error finding container 382d9420b96dd6262a050ec35ad2a568a1fa79f877c11585086236ee918e80b7: Status 404 returned error can't find the container with id 382d9420b96dd6262a050ec35ad2a568a1fa79f877c11585086236ee918e80b7 Mar 19 20:30:01 crc kubenswrapper[4799]: I0319 20:30:01.131284 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-p96cm" event={"ID":"2b3f2053-f94c-40f3-876d-0e3b8ca33c77","Type":"ContainerStarted","Data":"486f0c4f574981351b30e706d9b7ead94609b43772223c648d8c18b4a4c317c6"} Mar 19 20:30:01 crc kubenswrapper[4799]: I0319 20:30:01.131324 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2"] Mar 19 20:30:01 crc kubenswrapper[4799]: I0319 20:30:01.131745 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" event={"ID":"67210496-c7b2-4ab5-aae0-23e19f947c67","Type":"ContainerStarted","Data":"382d9420b96dd6262a050ec35ad2a568a1fa79f877c11585086236ee918e80b7"} Mar 19 20:30:02 crc kubenswrapper[4799]: I0319 20:30:02.147064 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="67210496-c7b2-4ab5-aae0-23e19f947c67" containerID="bc7a0bc263571ef8c2f0a8a58e18f261ba8239ed3d8ebe1d3ce04936e7b059be" exitCode=0 Mar 19 20:30:02 crc kubenswrapper[4799]: I0319 20:30:02.147167 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" event={"ID":"67210496-c7b2-4ab5-aae0-23e19f947c67","Type":"ContainerDied","Data":"bc7a0bc263571ef8c2f0a8a58e18f261ba8239ed3d8ebe1d3ce04936e7b059be"} Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.521521 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.610001 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67210496-c7b2-4ab5-aae0-23e19f947c67-config-volume\") pod \"67210496-c7b2-4ab5-aae0-23e19f947c67\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.610064 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67210496-c7b2-4ab5-aae0-23e19f947c67-secret-volume\") pod \"67210496-c7b2-4ab5-aae0-23e19f947c67\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.610197 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q7rl\" (UniqueName: \"kubernetes.io/projected/67210496-c7b2-4ab5-aae0-23e19f947c67-kube-api-access-8q7rl\") pod \"67210496-c7b2-4ab5-aae0-23e19f947c67\" (UID: \"67210496-c7b2-4ab5-aae0-23e19f947c67\") " Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.610818 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67210496-c7b2-4ab5-aae0-23e19f947c67-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "67210496-c7b2-4ab5-aae0-23e19f947c67" (UID: "67210496-c7b2-4ab5-aae0-23e19f947c67"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.611556 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67210496-c7b2-4ab5-aae0-23e19f947c67-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.615822 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67210496-c7b2-4ab5-aae0-23e19f947c67-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67210496-c7b2-4ab5-aae0-23e19f947c67" (UID: "67210496-c7b2-4ab5-aae0-23e19f947c67"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.616369 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67210496-c7b2-4ab5-aae0-23e19f947c67-kube-api-access-8q7rl" (OuterVolumeSpecName: "kube-api-access-8q7rl") pod "67210496-c7b2-4ab5-aae0-23e19f947c67" (UID: "67210496-c7b2-4ab5-aae0-23e19f947c67"). InnerVolumeSpecName "kube-api-access-8q7rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.713236 4799 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67210496-c7b2-4ab5-aae0-23e19f947c67-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:03 crc kubenswrapper[4799]: I0319 20:30:03.713276 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q7rl\" (UniqueName: \"kubernetes.io/projected/67210496-c7b2-4ab5-aae0-23e19f947c67-kube-api-access-8q7rl\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:04 crc kubenswrapper[4799]: I0319 20:30:04.176681 4799 generic.go:334] "Generic (PLEG): container finished" podID="2b3f2053-f94c-40f3-876d-0e3b8ca33c77" containerID="d92a3732a5ec0c6a9848dc90b0df404e0a9525a581adf9b7b9de5b97f747ecca" exitCode=0 Mar 19 20:30:04 crc kubenswrapper[4799]: I0319 20:30:04.176792 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-p96cm" event={"ID":"2b3f2053-f94c-40f3-876d-0e3b8ca33c77","Type":"ContainerDied","Data":"d92a3732a5ec0c6a9848dc90b0df404e0a9525a581adf9b7b9de5b97f747ecca"} Mar 19 20:30:04 crc kubenswrapper[4799]: I0319 20:30:04.179174 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" event={"ID":"67210496-c7b2-4ab5-aae0-23e19f947c67","Type":"ContainerDied","Data":"382d9420b96dd6262a050ec35ad2a568a1fa79f877c11585086236ee918e80b7"} Mar 19 20:30:04 crc kubenswrapper[4799]: I0319 20:30:04.179550 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="382d9420b96dd6262a050ec35ad2a568a1fa79f877c11585086236ee918e80b7" Mar 19 20:30:04 crc kubenswrapper[4799]: I0319 20:30:04.179260 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2" Mar 19 20:30:05 crc kubenswrapper[4799]: I0319 20:30:05.591176 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:05 crc kubenswrapper[4799]: I0319 20:30:05.674437 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knrw7\" (UniqueName: \"kubernetes.io/projected/2b3f2053-f94c-40f3-876d-0e3b8ca33c77-kube-api-access-knrw7\") pod \"2b3f2053-f94c-40f3-876d-0e3b8ca33c77\" (UID: \"2b3f2053-f94c-40f3-876d-0e3b8ca33c77\") " Mar 19 20:30:05 crc kubenswrapper[4799]: I0319 20:30:05.681528 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3f2053-f94c-40f3-876d-0e3b8ca33c77-kube-api-access-knrw7" (OuterVolumeSpecName: "kube-api-access-knrw7") pod "2b3f2053-f94c-40f3-876d-0e3b8ca33c77" (UID: "2b3f2053-f94c-40f3-876d-0e3b8ca33c77"). InnerVolumeSpecName "kube-api-access-knrw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:30:05 crc kubenswrapper[4799]: I0319 20:30:05.776445 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knrw7\" (UniqueName: \"kubernetes.io/projected/2b3f2053-f94c-40f3-876d-0e3b8ca33c77-kube-api-access-knrw7\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:06 crc kubenswrapper[4799]: I0319 20:30:06.204980 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-p96cm" event={"ID":"2b3f2053-f94c-40f3-876d-0e3b8ca33c77","Type":"ContainerDied","Data":"486f0c4f574981351b30e706d9b7ead94609b43772223c648d8c18b4a4c317c6"} Mar 19 20:30:06 crc kubenswrapper[4799]: I0319 20:30:06.205050 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486f0c4f574981351b30e706d9b7ead94609b43772223c648d8c18b4a4c317c6" Mar 19 20:30:06 crc kubenswrapper[4799]: I0319 20:30:06.205071 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-p96cm" Mar 19 20:30:06 crc kubenswrapper[4799]: I0319 20:30:06.672744 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-nkqck"] Mar 19 20:30:06 crc kubenswrapper[4799]: I0319 20:30:06.681528 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-nkqck"] Mar 19 20:30:07 crc kubenswrapper[4799]: I0319 20:30:07.147667 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f8c9fc-4f87-4da0-aa76-908eb7e729d6" path="/var/lib/kubelet/pods/28f8c9fc-4f87-4da0-aa76-908eb7e729d6/volumes" Mar 19 20:30:28 crc kubenswrapper[4799]: I0319 20:30:28.756092 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 20:30:28 crc kubenswrapper[4799]: I0319 20:30:28.756864 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:30:57 crc kubenswrapper[4799]: I0319 20:30:57.019370 4799 scope.go:117] "RemoveContainer" containerID="153ce1bd37302784eddb1b73676555827f74959125921a6a3148b8de7d0946c2" Mar 19 20:30:58 crc kubenswrapper[4799]: I0319 20:30:58.756516 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:30:58 crc kubenswrapper[4799]: I0319 20:30:58.757076 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:31:10 crc kubenswrapper[4799]: I0319 20:31:10.977728 4799 generic.go:334] "Generic (PLEG): container finished" podID="8befdbab-e306-4827-98a7-f042c02380ae" containerID="eb34cf8298f345ce8fb98e37c591cbd9984d9335d0c1ecec54e7119d7b5e63a1" exitCode=0 Mar 19 20:31:10 crc kubenswrapper[4799]: I0319 20:31:10.977803 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" event={"ID":"8befdbab-e306-4827-98a7-f042c02380ae","Type":"ContainerDied","Data":"eb34cf8298f345ce8fb98e37c591cbd9984d9335d0c1ecec54e7119d7b5e63a1"} Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 
20:31:12.602227 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.634512 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-bootstrap-combined-ca-bundle\") pod \"8befdbab-e306-4827-98a7-f042c02380ae\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.634587 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtk25\" (UniqueName: \"kubernetes.io/projected/8befdbab-e306-4827-98a7-f042c02380ae-kube-api-access-mtk25\") pod \"8befdbab-e306-4827-98a7-f042c02380ae\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.635876 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-inventory\") pod \"8befdbab-e306-4827-98a7-f042c02380ae\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.636025 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-ssh-key-openstack-edpm-ipam\") pod \"8befdbab-e306-4827-98a7-f042c02380ae\" (UID: \"8befdbab-e306-4827-98a7-f042c02380ae\") " Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.645188 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8befdbab-e306-4827-98a7-f042c02380ae-kube-api-access-mtk25" (OuterVolumeSpecName: "kube-api-access-mtk25") pod "8befdbab-e306-4827-98a7-f042c02380ae" (UID: "8befdbab-e306-4827-98a7-f042c02380ae"). 
InnerVolumeSpecName "kube-api-access-mtk25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.652753 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8befdbab-e306-4827-98a7-f042c02380ae" (UID: "8befdbab-e306-4827-98a7-f042c02380ae"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.679328 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-inventory" (OuterVolumeSpecName: "inventory") pod "8befdbab-e306-4827-98a7-f042c02380ae" (UID: "8befdbab-e306-4827-98a7-f042c02380ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.680679 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8befdbab-e306-4827-98a7-f042c02380ae" (UID: "8befdbab-e306-4827-98a7-f042c02380ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.738743 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.738796 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.738812 4799 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8befdbab-e306-4827-98a7-f042c02380ae-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:12 crc kubenswrapper[4799]: I0319 20:31:12.738825 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtk25\" (UniqueName: \"kubernetes.io/projected/8befdbab-e306-4827-98a7-f042c02380ae-kube-api-access-mtk25\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.005185 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" event={"ID":"8befdbab-e306-4827-98a7-f042c02380ae","Type":"ContainerDied","Data":"bce8e702116f0cb6cf87bbb8f6292ddc547a5e23dfb6cace7eda97079aaec414"} Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.005260 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce8e702116f0cb6cf87bbb8f6292ddc547a5e23dfb6cace7eda97079aaec414" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.005729 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.151253 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll"] Mar 19 20:31:13 crc kubenswrapper[4799]: E0319 20:31:13.151992 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67210496-c7b2-4ab5-aae0-23e19f947c67" containerName="collect-profiles" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.152031 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="67210496-c7b2-4ab5-aae0-23e19f947c67" containerName="collect-profiles" Mar 19 20:31:13 crc kubenswrapper[4799]: E0319 20:31:13.152076 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3f2053-f94c-40f3-876d-0e3b8ca33c77" containerName="oc" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.152090 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3f2053-f94c-40f3-876d-0e3b8ca33c77" containerName="oc" Mar 19 20:31:13 crc kubenswrapper[4799]: E0319 20:31:13.152131 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8befdbab-e306-4827-98a7-f042c02380ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.152145 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8befdbab-e306-4827-98a7-f042c02380ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.152557 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3f2053-f94c-40f3-876d-0e3b8ca33c77" containerName="oc" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.152636 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="67210496-c7b2-4ab5-aae0-23e19f947c67" containerName="collect-profiles" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.152665 4799 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8befdbab-e306-4827-98a7-f042c02380ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.153792 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.157130 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.157338 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.157508 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.158015 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.176549 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll"] Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.261167 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.262117 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chtd\" (UniqueName: 
\"kubernetes.io/projected/8e141b61-4ffc-407f-a554-58f8176b1b18-kube-api-access-4chtd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.262262 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.364110 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.364357 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chtd\" (UniqueName: \"kubernetes.io/projected/8e141b61-4ffc-407f-a554-58f8176b1b18-kube-api-access-4chtd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.364415 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" 
(UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.370515 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.371500 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.390260 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chtd\" (UniqueName: \"kubernetes.io/projected/8e141b61-4ffc-407f-a554-58f8176b1b18-kube-api-access-4chtd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vgxll\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:13 crc kubenswrapper[4799]: I0319 20:31:13.485428 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:31:14 crc kubenswrapper[4799]: I0319 20:31:14.055397 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll"] Mar 19 20:31:15 crc kubenswrapper[4799]: I0319 20:31:15.030560 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" event={"ID":"8e141b61-4ffc-407f-a554-58f8176b1b18","Type":"ContainerStarted","Data":"84ce320ab019464e9ce834dd54b2a23a01417aec1ea26b3aab21cbb69a07301e"} Mar 19 20:31:16 crc kubenswrapper[4799]: I0319 20:31:16.057977 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" event={"ID":"8e141b61-4ffc-407f-a554-58f8176b1b18","Type":"ContainerStarted","Data":"fa004322c9b929ba298b5967cea13e3d4e73d676eae634994ef869a13a0f3716"} Mar 19 20:31:16 crc kubenswrapper[4799]: I0319 20:31:16.084877 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" podStartSLOduration=1.586000065 podStartE2EDuration="3.084841926s" podCreationTimestamp="2026-03-19 20:31:13 +0000 UTC" firstStartedPulling="2026-03-19 20:31:14.06546928 +0000 UTC m=+1551.671422352" lastFinishedPulling="2026-03-19 20:31:15.564311101 +0000 UTC m=+1553.170264213" observedRunningTime="2026-03-19 20:31:16.079925795 +0000 UTC m=+1553.685878877" watchObservedRunningTime="2026-03-19 20:31:16.084841926 +0000 UTC m=+1553.690795008" Mar 19 20:31:19 crc kubenswrapper[4799]: I0319 20:31:19.894460 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8djb"] Mar 19 20:31:19 crc kubenswrapper[4799]: I0319 20:31:19.898405 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:19 crc kubenswrapper[4799]: I0319 20:31:19.932849 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8djb"] Mar 19 20:31:19 crc kubenswrapper[4799]: I0319 20:31:19.939702 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-catalog-content\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:19 crc kubenswrapper[4799]: I0319 20:31:19.939806 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-utilities\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:19 crc kubenswrapper[4799]: I0319 20:31:19.939837 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhm9z\" (UniqueName: \"kubernetes.io/projected/0d036633-8176-4398-b267-6dbaf9e9d422-kube-api-access-zhm9z\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.041179 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-catalog-content\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.041300 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-utilities\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.041333 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhm9z\" (UniqueName: \"kubernetes.io/projected/0d036633-8176-4398-b267-6dbaf9e9d422-kube-api-access-zhm9z\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.041707 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-catalog-content\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.041848 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-utilities\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.060027 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhm9z\" (UniqueName: \"kubernetes.io/projected/0d036633-8176-4398-b267-6dbaf9e9d422-kube-api-access-zhm9z\") pod \"certified-operators-r8djb\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.235359 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:20 crc kubenswrapper[4799]: I0319 20:31:20.722809 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8djb"] Mar 19 20:31:21 crc kubenswrapper[4799]: I0319 20:31:21.112144 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerStarted","Data":"94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122"} Mar 19 20:31:21 crc kubenswrapper[4799]: I0319 20:31:21.112455 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerStarted","Data":"2af4096c872c2ba75d7b4f04aa2b822ef4f0357b3d96de764fec3e447023e1b7"} Mar 19 20:31:22 crc kubenswrapper[4799]: I0319 20:31:22.123496 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d036633-8176-4398-b267-6dbaf9e9d422" containerID="94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122" exitCode=0 Mar 19 20:31:22 crc kubenswrapper[4799]: I0319 20:31:22.123607 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerDied","Data":"94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122"} Mar 19 20:31:23 crc kubenswrapper[4799]: I0319 20:31:23.134442 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerStarted","Data":"bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca"} Mar 19 20:31:24 crc kubenswrapper[4799]: I0319 20:31:24.148116 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d036633-8176-4398-b267-6dbaf9e9d422" 
containerID="bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca" exitCode=0 Mar 19 20:31:24 crc kubenswrapper[4799]: I0319 20:31:24.148314 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerDied","Data":"bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca"} Mar 19 20:31:25 crc kubenswrapper[4799]: I0319 20:31:25.160980 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerStarted","Data":"f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a"} Mar 19 20:31:25 crc kubenswrapper[4799]: I0319 20:31:25.186099 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8djb" podStartSLOduration=3.754885775 podStartE2EDuration="6.186082879s" podCreationTimestamp="2026-03-19 20:31:19 +0000 UTC" firstStartedPulling="2026-03-19 20:31:22.126166416 +0000 UTC m=+1559.732119488" lastFinishedPulling="2026-03-19 20:31:24.55736349 +0000 UTC m=+1562.163316592" observedRunningTime="2026-03-19 20:31:25.180816729 +0000 UTC m=+1562.786769841" watchObservedRunningTime="2026-03-19 20:31:25.186082879 +0000 UTC m=+1562.792035951" Mar 19 20:31:28 crc kubenswrapper[4799]: I0319 20:31:28.756334 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:31:28 crc kubenswrapper[4799]: I0319 20:31:28.756760 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:31:28 crc kubenswrapper[4799]: I0319 20:31:28.756820 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:31:28 crc kubenswrapper[4799]: I0319 20:31:28.757913 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09af92723404390c6609268ff888c50debb267c3f634bab822b68cda538b8c1c"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:31:28 crc kubenswrapper[4799]: I0319 20:31:28.757990 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://09af92723404390c6609268ff888c50debb267c3f634bab822b68cda538b8c1c" gracePeriod=600 Mar 19 20:31:29 crc kubenswrapper[4799]: I0319 20:31:29.205863 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="09af92723404390c6609268ff888c50debb267c3f634bab822b68cda538b8c1c" exitCode=0 Mar 19 20:31:29 crc kubenswrapper[4799]: I0319 20:31:29.205941 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"09af92723404390c6609268ff888c50debb267c3f634bab822b68cda538b8c1c"} Mar 19 20:31:29 crc kubenswrapper[4799]: I0319 20:31:29.206310 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac"} Mar 19 20:31:29 crc kubenswrapper[4799]: I0319 20:31:29.206347 4799 scope.go:117] "RemoveContainer" containerID="896459b668fd89c49f3549689a461713efd29689b0eb95fb693ea8252573a827" Mar 19 20:31:30 crc kubenswrapper[4799]: I0319 20:31:30.235676 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:30 crc kubenswrapper[4799]: I0319 20:31:30.236200 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:30 crc kubenswrapper[4799]: I0319 20:31:30.325081 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:31 crc kubenswrapper[4799]: I0319 20:31:31.316119 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:31 crc kubenswrapper[4799]: I0319 20:31:31.384100 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8djb"] Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.255786 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8djb" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="registry-server" containerID="cri-o://f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a" gracePeriod=2 Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.798684 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.886030 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-catalog-content\") pod \"0d036633-8176-4398-b267-6dbaf9e9d422\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.886118 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-utilities\") pod \"0d036633-8176-4398-b267-6dbaf9e9d422\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.886367 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhm9z\" (UniqueName: \"kubernetes.io/projected/0d036633-8176-4398-b267-6dbaf9e9d422-kube-api-access-zhm9z\") pod \"0d036633-8176-4398-b267-6dbaf9e9d422\" (UID: \"0d036633-8176-4398-b267-6dbaf9e9d422\") " Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.887681 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-utilities" (OuterVolumeSpecName: "utilities") pod "0d036633-8176-4398-b267-6dbaf9e9d422" (UID: "0d036633-8176-4398-b267-6dbaf9e9d422"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.893363 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d036633-8176-4398-b267-6dbaf9e9d422-kube-api-access-zhm9z" (OuterVolumeSpecName: "kube-api-access-zhm9z") pod "0d036633-8176-4398-b267-6dbaf9e9d422" (UID: "0d036633-8176-4398-b267-6dbaf9e9d422"). InnerVolumeSpecName "kube-api-access-zhm9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.955454 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d036633-8176-4398-b267-6dbaf9e9d422" (UID: "0d036633-8176-4398-b267-6dbaf9e9d422"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.988473 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhm9z\" (UniqueName: \"kubernetes.io/projected/0d036633-8176-4398-b267-6dbaf9e9d422-kube-api-access-zhm9z\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.988521 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:33 crc kubenswrapper[4799]: I0319 20:31:33.988541 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d036633-8176-4398-b267-6dbaf9e9d422-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.272203 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d036633-8176-4398-b267-6dbaf9e9d422" containerID="f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a" exitCode=0 Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.272245 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerDied","Data":"f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a"} Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.272272 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-r8djb" event={"ID":"0d036633-8176-4398-b267-6dbaf9e9d422","Type":"ContainerDied","Data":"2af4096c872c2ba75d7b4f04aa2b822ef4f0357b3d96de764fec3e447023e1b7"} Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.272287 4799 scope.go:117] "RemoveContainer" containerID="f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.272477 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8djb" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.309983 4799 scope.go:117] "RemoveContainer" containerID="bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.324143 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8djb"] Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.332400 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8djb"] Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.343848 4799 scope.go:117] "RemoveContainer" containerID="94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.391801 4799 scope.go:117] "RemoveContainer" containerID="f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a" Mar 19 20:31:34 crc kubenswrapper[4799]: E0319 20:31:34.392364 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a\": container with ID starting with f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a not found: ID does not exist" containerID="f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 
20:31:34.392411 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a"} err="failed to get container status \"f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a\": rpc error: code = NotFound desc = could not find container \"f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a\": container with ID starting with f67b2e00b5dcf5e9cae07b542a7cecb93c9f893567336fe965f0632c75cd1a1a not found: ID does not exist" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.392430 4799 scope.go:117] "RemoveContainer" containerID="bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca" Mar 19 20:31:34 crc kubenswrapper[4799]: E0319 20:31:34.392718 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca\": container with ID starting with bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca not found: ID does not exist" containerID="bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.392745 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca"} err="failed to get container status \"bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca\": rpc error: code = NotFound desc = could not find container \"bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca\": container with ID starting with bab61d7f200a7cc6866713e2bc6cdb9af2a88ac8bf4d27be81bf4ab84dab87ca not found: ID does not exist" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.392759 4799 scope.go:117] "RemoveContainer" containerID="94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122" Mar 19 20:31:34 crc 
kubenswrapper[4799]: E0319 20:31:34.393049 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122\": container with ID starting with 94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122 not found: ID does not exist" containerID="94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122" Mar 19 20:31:34 crc kubenswrapper[4799]: I0319 20:31:34.393070 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122"} err="failed to get container status \"94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122\": rpc error: code = NotFound desc = could not find container \"94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122\": container with ID starting with 94aadf5a97c21fc10e9fe73efa6da28c0c21f7734faf6a38fccd5138aeb82122 not found: ID does not exist" Mar 19 20:31:35 crc kubenswrapper[4799]: I0319 20:31:35.138977 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" path="/var/lib/kubelet/pods/0d036633-8176-4398-b267-6dbaf9e9d422/volumes" Mar 19 20:31:57 crc kubenswrapper[4799]: I0319 20:31:57.131985 4799 scope.go:117] "RemoveContainer" containerID="3975bc0e0caaa101f99848db60036ac2d9f82a22215243eb052d2a1bec787e85" Mar 19 20:31:57 crc kubenswrapper[4799]: I0319 20:31:57.173641 4799 scope.go:117] "RemoveContainer" containerID="bf91bb519310d3b782d3965b79aec26003904606f8d83c4cf8761dccff940634" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.159241 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565872-b9qbx"] Mar 19 20:32:00 crc kubenswrapper[4799]: E0319 20:32:00.160324 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" 
containerName="extract-content" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.160342 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="extract-content" Mar 19 20:32:00 crc kubenswrapper[4799]: E0319 20:32:00.160370 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="extract-utilities" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.160409 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="extract-utilities" Mar 19 20:32:00 crc kubenswrapper[4799]: E0319 20:32:00.160432 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="registry-server" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.160444 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="registry-server" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.160748 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d036633-8176-4398-b267-6dbaf9e9d422" containerName="registry-server" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.161642 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.164194 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.164270 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.165481 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.175127 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565872-b9qbx"] Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.275003 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg57c\" (UniqueName: \"kubernetes.io/projected/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293-kube-api-access-zg57c\") pod \"auto-csr-approver-29565872-b9qbx\" (UID: \"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293\") " pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.377371 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg57c\" (UniqueName: \"kubernetes.io/projected/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293-kube-api-access-zg57c\") pod \"auto-csr-approver-29565872-b9qbx\" (UID: \"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293\") " pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.398140 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg57c\" (UniqueName: \"kubernetes.io/projected/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293-kube-api-access-zg57c\") pod \"auto-csr-approver-29565872-b9qbx\" (UID: \"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293\") " 
pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:00 crc kubenswrapper[4799]: I0319 20:32:00.504259 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:01 crc kubenswrapper[4799]: I0319 20:32:01.002214 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565872-b9qbx"] Mar 19 20:32:01 crc kubenswrapper[4799]: I0319 20:32:01.595145 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" event={"ID":"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293","Type":"ContainerStarted","Data":"bb33aaeb82096143eb4a98f35366aa68c809a647882c7b1fd8a42ac05de54001"} Mar 19 20:32:02 crc kubenswrapper[4799]: I0319 20:32:02.607571 4799 generic.go:334] "Generic (PLEG): container finished" podID="30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293" containerID="b13d06143d7a46250e7c9f42bd9ff09aecd14950703ed9c89945d81034a818d8" exitCode=0 Mar 19 20:32:02 crc kubenswrapper[4799]: I0319 20:32:02.607664 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" event={"ID":"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293","Type":"ContainerDied","Data":"b13d06143d7a46250e7c9f42bd9ff09aecd14950703ed9c89945d81034a818d8"} Mar 19 20:32:03 crc kubenswrapper[4799]: I0319 20:32:03.972632 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:04 crc kubenswrapper[4799]: I0319 20:32:04.114746 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg57c\" (UniqueName: \"kubernetes.io/projected/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293-kube-api-access-zg57c\") pod \"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293\" (UID: \"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293\") " Mar 19 20:32:04 crc kubenswrapper[4799]: I0319 20:32:04.120955 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293-kube-api-access-zg57c" (OuterVolumeSpecName: "kube-api-access-zg57c") pod "30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293" (UID: "30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293"). InnerVolumeSpecName "kube-api-access-zg57c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:32:04 crc kubenswrapper[4799]: I0319 20:32:04.218935 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg57c\" (UniqueName: \"kubernetes.io/projected/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293-kube-api-access-zg57c\") on node \"crc\" DevicePath \"\"" Mar 19 20:32:04 crc kubenswrapper[4799]: I0319 20:32:04.629549 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" event={"ID":"30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293","Type":"ContainerDied","Data":"bb33aaeb82096143eb4a98f35366aa68c809a647882c7b1fd8a42ac05de54001"} Mar 19 20:32:04 crc kubenswrapper[4799]: I0319 20:32:04.629908 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb33aaeb82096143eb4a98f35366aa68c809a647882c7b1fd8a42ac05de54001" Mar 19 20:32:04 crc kubenswrapper[4799]: I0319 20:32:04.629637 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-b9qbx" Mar 19 20:32:05 crc kubenswrapper[4799]: I0319 20:32:05.077590 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-wxfvv"] Mar 19 20:32:05 crc kubenswrapper[4799]: I0319 20:32:05.087791 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-wxfvv"] Mar 19 20:32:05 crc kubenswrapper[4799]: I0319 20:32:05.128995 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="202925b0-e76d-46a3-80aa-0ac2ba9dad11" path="/var/lib/kubelet/pods/202925b0-e76d-46a3-80aa-0ac2ba9dad11/volumes" Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.078771 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-tm2mp"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.092816 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ebac-account-create-update-f6j6j"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.101728 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vk8mz"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.110657 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a833-account-create-update-ps2nk"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.120468 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ebac-account-create-update-f6j6j"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.130551 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-tm2mp"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.139083 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a833-account-create-update-ps2nk"] Mar 19 20:32:38 crc kubenswrapper[4799]: I0319 20:32:38.146955 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-create-vk8mz"] Mar 19 20:32:39 crc kubenswrapper[4799]: I0319 20:32:39.125473 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dcf216f-4e59-4164-98c4-b13a5ee6ac18" path="/var/lib/kubelet/pods/2dcf216f-4e59-4164-98c4-b13a5ee6ac18/volumes" Mar 19 20:32:39 crc kubenswrapper[4799]: I0319 20:32:39.126016 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f16902-ec65-40c0-bb69-0c0ee2d8b2e2" path="/var/lib/kubelet/pods/43f16902-ec65-40c0-bb69-0c0ee2d8b2e2/volumes" Mar 19 20:32:39 crc kubenswrapper[4799]: I0319 20:32:39.126527 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6023c298-3c8c-4d62-9f55-c55bede668e6" path="/var/lib/kubelet/pods/6023c298-3c8c-4d62-9f55-c55bede668e6/volumes" Mar 19 20:32:39 crc kubenswrapper[4799]: I0319 20:32:39.127043 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a73293d-f9d4-42f6-b03d-21a7ebb99fbf" path="/var/lib/kubelet/pods/8a73293d-f9d4-42f6-b03d-21a7ebb99fbf/volumes" Mar 19 20:32:43 crc kubenswrapper[4799]: I0319 20:32:43.123271 4799 generic.go:334] "Generic (PLEG): container finished" podID="8e141b61-4ffc-407f-a554-58f8176b1b18" containerID="fa004322c9b929ba298b5967cea13e3d4e73d676eae634994ef869a13a0f3716" exitCode=0 Mar 19 20:32:43 crc kubenswrapper[4799]: I0319 20:32:43.130605 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" event={"ID":"8e141b61-4ffc-407f-a554-58f8176b1b18","Type":"ContainerDied","Data":"fa004322c9b929ba298b5967cea13e3d4e73d676eae634994ef869a13a0f3716"} Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.611927 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.748032 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-ssh-key-openstack-edpm-ipam\") pod \"8e141b61-4ffc-407f-a554-58f8176b1b18\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.748454 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4chtd\" (UniqueName: \"kubernetes.io/projected/8e141b61-4ffc-407f-a554-58f8176b1b18-kube-api-access-4chtd\") pod \"8e141b61-4ffc-407f-a554-58f8176b1b18\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.748506 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-inventory\") pod \"8e141b61-4ffc-407f-a554-58f8176b1b18\" (UID: \"8e141b61-4ffc-407f-a554-58f8176b1b18\") " Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.754743 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e141b61-4ffc-407f-a554-58f8176b1b18-kube-api-access-4chtd" (OuterVolumeSpecName: "kube-api-access-4chtd") pod "8e141b61-4ffc-407f-a554-58f8176b1b18" (UID: "8e141b61-4ffc-407f-a554-58f8176b1b18"). InnerVolumeSpecName "kube-api-access-4chtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.782723 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-inventory" (OuterVolumeSpecName: "inventory") pod "8e141b61-4ffc-407f-a554-58f8176b1b18" (UID: "8e141b61-4ffc-407f-a554-58f8176b1b18"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.785887 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e141b61-4ffc-407f-a554-58f8176b1b18" (UID: "8e141b61-4ffc-407f-a554-58f8176b1b18"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.851626 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4chtd\" (UniqueName: \"kubernetes.io/projected/8e141b61-4ffc-407f-a554-58f8176b1b18-kube-api-access-4chtd\") on node \"crc\" DevicePath \"\"" Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.851685 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:32:44 crc kubenswrapper[4799]: I0319 20:32:44.851704 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e141b61-4ffc-407f-a554-58f8176b1b18-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.152531 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" event={"ID":"8e141b61-4ffc-407f-a554-58f8176b1b18","Type":"ContainerDied","Data":"84ce320ab019464e9ce834dd54b2a23a01417aec1ea26b3aab21cbb69a07301e"} Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.152590 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84ce320ab019464e9ce834dd54b2a23a01417aec1ea26b3aab21cbb69a07301e" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 
20:32:45.152604 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vgxll" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.258018 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw"] Mar 19 20:32:45 crc kubenswrapper[4799]: E0319 20:32:45.258468 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293" containerName="oc" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.258491 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293" containerName="oc" Mar 19 20:32:45 crc kubenswrapper[4799]: E0319 20:32:45.258532 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e141b61-4ffc-407f-a554-58f8176b1b18" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.258542 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e141b61-4ffc-407f-a554-58f8176b1b18" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.258741 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e141b61-4ffc-407f-a554-58f8176b1b18" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.258787 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293" containerName="oc" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.259589 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.264408 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.264643 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.264738 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.264650 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.278747 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw"] Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.362243 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.362418 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc 
kubenswrapper[4799]: I0319 20:32:45.362538 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rb99\" (UniqueName: \"kubernetes.io/projected/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-kube-api-access-6rb99\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.465168 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.465305 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.465469 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rb99\" (UniqueName: \"kubernetes.io/projected/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-kube-api-access-6rb99\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.471420 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.472432 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.500108 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rb99\" (UniqueName: \"kubernetes.io/projected/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-kube-api-access-6rb99\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:45 crc kubenswrapper[4799]: I0319 20:32:45.592234 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:32:46 crc kubenswrapper[4799]: I0319 20:32:46.046623 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-60fe-account-create-update-x4bxp"] Mar 19 20:32:46 crc kubenswrapper[4799]: I0319 20:32:46.067035 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4z9kc"] Mar 19 20:32:46 crc kubenswrapper[4799]: I0319 20:32:46.091297 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-60fe-account-create-update-x4bxp"] Mar 19 20:32:46 crc kubenswrapper[4799]: I0319 20:32:46.103472 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4z9kc"] Mar 19 20:32:46 crc kubenswrapper[4799]: I0319 20:32:46.159795 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw"] Mar 19 20:32:47 crc kubenswrapper[4799]: I0319 20:32:47.136726 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872fa1da-9d43-4344-aee7-383a1f418430" path="/var/lib/kubelet/pods/872fa1da-9d43-4344-aee7-383a1f418430/volumes" Mar 19 20:32:47 crc kubenswrapper[4799]: I0319 20:32:47.137954 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c84329-8a13-4319-95a0-313681e0c23d" path="/var/lib/kubelet/pods/a3c84329-8a13-4319-95a0-313681e0c23d/volumes" Mar 19 20:32:47 crc kubenswrapper[4799]: I0319 20:32:47.194163 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" event={"ID":"8886a9f8-8f15-43f4-a721-3a487d2ff6f7","Type":"ContainerStarted","Data":"d21879ef350a418249ecb532c5477917c194d950a6b0a1b4d1bd5ab3b501cdce"} Mar 19 20:32:48 crc kubenswrapper[4799]: I0319 20:32:48.207059 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" event={"ID":"8886a9f8-8f15-43f4-a721-3a487d2ff6f7","Type":"ContainerStarted","Data":"87f0e9a060bc181634e9448b781ceb9c30505ff536ae49f4aadf61bec2aa9d56"} Mar 19 20:32:48 crc kubenswrapper[4799]: I0319 20:32:48.251241 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" podStartSLOduration=2.124978775 podStartE2EDuration="3.251217962s" podCreationTimestamp="2026-03-19 20:32:45 +0000 UTC" firstStartedPulling="2026-03-19 20:32:46.162061344 +0000 UTC m=+1643.768014406" lastFinishedPulling="2026-03-19 20:32:47.288300521 +0000 UTC m=+1644.894253593" observedRunningTime="2026-03-19 20:32:48.227181542 +0000 UTC m=+1645.833134624" watchObservedRunningTime="2026-03-19 20:32:48.251217962 +0000 UTC m=+1645.857171044" Mar 19 20:32:57 crc kubenswrapper[4799]: I0319 20:32:57.288521 4799 scope.go:117] "RemoveContainer" containerID="0b8a51b2d969bdc26cb9c7ea02ba8c4b56045e9a8043090cad3c15ff6eb87dc0" Mar 19 20:32:57 crc kubenswrapper[4799]: I0319 20:32:57.322535 4799 scope.go:117] "RemoveContainer" containerID="34b4912ed5f0443d0eb92fceb8283110c88557fba92e208e16f44c826abf69c3" Mar 19 20:32:57 crc kubenswrapper[4799]: I0319 20:32:57.395105 4799 scope.go:117] "RemoveContainer" containerID="bdf13e8c567059e2e64dfd82bb6b750237a8bea5962dab58029151e3b6148bc1" Mar 19 20:32:57 crc kubenswrapper[4799]: I0319 20:32:57.445015 4799 scope.go:117] "RemoveContainer" containerID="51ddad6e8229f881952601504fb79069a156159dca5061b9e3077fca7e080b46" Mar 19 20:32:57 crc kubenswrapper[4799]: I0319 20:32:57.479685 4799 scope.go:117] "RemoveContainer" containerID="b95b2d82ab6cd824f0ca6556956b4abcf544403b60629e3b3c60b38beb8af097" Mar 19 20:32:57 crc kubenswrapper[4799]: I0319 20:32:57.541221 4799 scope.go:117] "RemoveContainer" containerID="16013e65d880800730d4b216afa69c2977af4778be676c2e7a13f0f45c121de2" Mar 19 20:32:57 crc kubenswrapper[4799]: 
I0319 20:32:57.574080 4799 scope.go:117] "RemoveContainer" containerID="b087ccf55fd7c853a2ceaed6b641d9b27e0dd1af2c6d35efa658aed78f635e7b" Mar 19 20:33:01 crc kubenswrapper[4799]: I0319 20:33:01.057072 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6585v"] Mar 19 20:33:01 crc kubenswrapper[4799]: I0319 20:33:01.071895 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6585v"] Mar 19 20:33:01 crc kubenswrapper[4799]: I0319 20:33:01.127434 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c58bd42-1add-4893-96c1-bb363f7e2297" path="/var/lib/kubelet/pods/3c58bd42-1add-4893-96c1-bb363f7e2297/volumes" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.272550 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-md4pl"] Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.276074 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.297982 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-md4pl"] Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.430294 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-catalog-content\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.430541 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dw67\" (UniqueName: \"kubernetes.io/projected/2883bc15-124e-4f88-bbcf-305af7ef8218-kube-api-access-2dw67\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.431085 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-utilities\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.532599 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-utilities\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.532697 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-catalog-content\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.532723 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dw67\" (UniqueName: \"kubernetes.io/projected/2883bc15-124e-4f88-bbcf-305af7ef8218-kube-api-access-2dw67\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.533055 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-utilities\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.533242 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-catalog-content\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.551989 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dw67\" (UniqueName: \"kubernetes.io/projected/2883bc15-124e-4f88-bbcf-305af7ef8218-kube-api-access-2dw67\") pod \"redhat-marketplace-md4pl\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:08 crc kubenswrapper[4799]: I0319 20:33:08.608611 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.039056 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ndlfj"] Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.047711 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ndlfj"] Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.096703 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-md4pl"] Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.160291 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9a5a1a-cacb-4ae7-a087-e50045584210" path="/var/lib/kubelet/pods/ab9a5a1a-cacb-4ae7-a087-e50045584210/volumes" Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.427490 4799 generic.go:334] "Generic (PLEG): container finished" podID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerID="c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5" exitCode=0 Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.427540 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerDied","Data":"c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5"} Mar 19 20:33:09 crc kubenswrapper[4799]: I0319 20:33:09.427592 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerStarted","Data":"2784d86f6830596c40d97c4a0fc1b1bb5f30a5a86974b9f82a02af46f792db1d"} Mar 19 20:33:10 crc kubenswrapper[4799]: I0319 20:33:10.436797 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" 
event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerStarted","Data":"6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a"} Mar 19 20:33:11 crc kubenswrapper[4799]: I0319 20:33:11.450255 4799 generic.go:334] "Generic (PLEG): container finished" podID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerID="6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a" exitCode=0 Mar 19 20:33:11 crc kubenswrapper[4799]: I0319 20:33:11.450304 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerDied","Data":"6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a"} Mar 19 20:33:12 crc kubenswrapper[4799]: I0319 20:33:12.464629 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerStarted","Data":"da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84"} Mar 19 20:33:12 crc kubenswrapper[4799]: I0319 20:33:12.484176 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-md4pl" podStartSLOduration=2.010331022 podStartE2EDuration="4.484156636s" podCreationTimestamp="2026-03-19 20:33:08 +0000 UTC" firstStartedPulling="2026-03-19 20:33:09.42925351 +0000 UTC m=+1667.035206582" lastFinishedPulling="2026-03-19 20:33:11.903079114 +0000 UTC m=+1669.509032196" observedRunningTime="2026-03-19 20:33:12.480242319 +0000 UTC m=+1670.086195391" watchObservedRunningTime="2026-03-19 20:33:12.484156636 +0000 UTC m=+1670.090109708" Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.046296 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-gk65m"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.063503 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-8945-account-create-update-kldff"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.076565 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h8b7k"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.086024 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5219-account-create-update-bp54l"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.093936 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ncd8j"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.101887 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8945-account-create-update-kldff"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.113303 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5219-account-create-update-bp54l"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.121816 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ncd8j"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.128617 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h8b7k"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.135849 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-gk65m"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.142828 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-42a2-account-create-update-c2ghz"] Mar 19 20:33:16 crc kubenswrapper[4799]: I0319 20:33:16.150006 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-42a2-account-create-update-c2ghz"] Mar 19 20:33:17 crc kubenswrapper[4799]: I0319 20:33:17.132182 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee26f4f-790a-479f-b0b0-23c17a5aa642" path="/var/lib/kubelet/pods/2ee26f4f-790a-479f-b0b0-23c17a5aa642/volumes" Mar 19 20:33:17 
crc kubenswrapper[4799]: I0319 20:33:17.132950 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3426fb78-4e09-4ddf-936d-48509c174c3f" path="/var/lib/kubelet/pods/3426fb78-4e09-4ddf-936d-48509c174c3f/volumes" Mar 19 20:33:17 crc kubenswrapper[4799]: I0319 20:33:17.133527 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900c3adf-1008-43dd-a517-eae371754fcd" path="/var/lib/kubelet/pods/900c3adf-1008-43dd-a517-eae371754fcd/volumes" Mar 19 20:33:17 crc kubenswrapper[4799]: I0319 20:33:17.134115 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9661e457-f424-4642-8551-c61fc2924ae9" path="/var/lib/kubelet/pods/9661e457-f424-4642-8551-c61fc2924ae9/volumes" Mar 19 20:33:17 crc kubenswrapper[4799]: I0319 20:33:17.135151 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e9a78e-8ff2-4edb-9ecc-35d304090da4" path="/var/lib/kubelet/pods/b2e9a78e-8ff2-4edb-9ecc-35d304090da4/volumes" Mar 19 20:33:17 crc kubenswrapper[4799]: I0319 20:33:17.135760 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ae96d4-a792-41df-85f7-3cf044fa8e0c" path="/var/lib/kubelet/pods/d5ae96d4-a792-41df-85f7-3cf044fa8e0c/volumes" Mar 19 20:33:18 crc kubenswrapper[4799]: I0319 20:33:18.609499 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:18 crc kubenswrapper[4799]: I0319 20:33:18.609888 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:18 crc kubenswrapper[4799]: I0319 20:33:18.665122 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:19 crc kubenswrapper[4799]: I0319 20:33:19.634915 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 
20:33:19 crc kubenswrapper[4799]: I0319 20:33:19.698853 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-md4pl"] Mar 19 20:33:21 crc kubenswrapper[4799]: I0319 20:33:21.580318 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-md4pl" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="registry-server" containerID="cri-o://da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84" gracePeriod=2 Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.194469 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.358282 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dw67\" (UniqueName: \"kubernetes.io/projected/2883bc15-124e-4f88-bbcf-305af7ef8218-kube-api-access-2dw67\") pod \"2883bc15-124e-4f88-bbcf-305af7ef8218\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.359028 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-catalog-content\") pod \"2883bc15-124e-4f88-bbcf-305af7ef8218\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.359209 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-utilities\") pod \"2883bc15-124e-4f88-bbcf-305af7ef8218\" (UID: \"2883bc15-124e-4f88-bbcf-305af7ef8218\") " Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.360125 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-utilities" (OuterVolumeSpecName: "utilities") pod "2883bc15-124e-4f88-bbcf-305af7ef8218" (UID: "2883bc15-124e-4f88-bbcf-305af7ef8218"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.365148 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2883bc15-124e-4f88-bbcf-305af7ef8218-kube-api-access-2dw67" (OuterVolumeSpecName: "kube-api-access-2dw67") pod "2883bc15-124e-4f88-bbcf-305af7ef8218" (UID: "2883bc15-124e-4f88-bbcf-305af7ef8218"). InnerVolumeSpecName "kube-api-access-2dw67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.409745 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2883bc15-124e-4f88-bbcf-305af7ef8218" (UID: "2883bc15-124e-4f88-bbcf-305af7ef8218"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.461608 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dw67\" (UniqueName: \"kubernetes.io/projected/2883bc15-124e-4f88-bbcf-305af7ef8218-kube-api-access-2dw67\") on node \"crc\" DevicePath \"\"" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.461647 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.461660 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2883bc15-124e-4f88-bbcf-305af7ef8218-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.591991 4799 generic.go:334] "Generic (PLEG): container finished" podID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerID="da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84" exitCode=0 Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.592054 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerDied","Data":"da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84"} Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.592126 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-md4pl" event={"ID":"2883bc15-124e-4f88-bbcf-305af7ef8218","Type":"ContainerDied","Data":"2784d86f6830596c40d97c4a0fc1b1bb5f30a5a86974b9f82a02af46f792db1d"} Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.592129 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-md4pl" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.592157 4799 scope.go:117] "RemoveContainer" containerID="da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.621022 4799 scope.go:117] "RemoveContainer" containerID="6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.657930 4799 scope.go:117] "RemoveContainer" containerID="c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.662317 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-md4pl"] Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.677969 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-md4pl"] Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.720197 4799 scope.go:117] "RemoveContainer" containerID="da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84" Mar 19 20:33:22 crc kubenswrapper[4799]: E0319 20:33:22.721111 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84\": container with ID starting with da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84 not found: ID does not exist" containerID="da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.721195 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84"} err="failed to get container status \"da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84\": rpc error: code = NotFound desc = could not find container 
\"da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84\": container with ID starting with da2fb51f17a6ffbd30fc64e1f1d13db8efe7fbb715264dd429516f571ffa4f84 not found: ID does not exist" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.721252 4799 scope.go:117] "RemoveContainer" containerID="6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a" Mar 19 20:33:22 crc kubenswrapper[4799]: E0319 20:33:22.721790 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a\": container with ID starting with 6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a not found: ID does not exist" containerID="6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.721853 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a"} err="failed to get container status \"6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a\": rpc error: code = NotFound desc = could not find container \"6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a\": container with ID starting with 6609da15ecb2cc295d47fa72353508a2c635ee939f9a066e11aa6a974a1d405a not found: ID does not exist" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.721910 4799 scope.go:117] "RemoveContainer" containerID="c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5" Mar 19 20:33:22 crc kubenswrapper[4799]: E0319 20:33:22.722367 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5\": container with ID starting with c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5 not found: ID does not exist" 
containerID="c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5" Mar 19 20:33:22 crc kubenswrapper[4799]: I0319 20:33:22.722468 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5"} err="failed to get container status \"c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5\": rpc error: code = NotFound desc = could not find container \"c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5\": container with ID starting with c7c908b712618afcb4ed9cf3519b08dfac0386f06f7e805bb3603600b1bd0ea5 not found: ID does not exist" Mar 19 20:33:23 crc kubenswrapper[4799]: I0319 20:33:23.133269 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" path="/var/lib/kubelet/pods/2883bc15-124e-4f88-bbcf-305af7ef8218/volumes" Mar 19 20:33:24 crc kubenswrapper[4799]: I0319 20:33:24.047870 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4492r"] Mar 19 20:33:24 crc kubenswrapper[4799]: I0319 20:33:24.058142 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4492r"] Mar 19 20:33:25 crc kubenswrapper[4799]: I0319 20:33:25.128828 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c863d9fd-be20-4dfd-992e-02af944c3382" path="/var/lib/kubelet/pods/c863d9fd-be20-4dfd-992e-02af944c3382/volumes" Mar 19 20:33:53 crc kubenswrapper[4799]: I0319 20:33:53.996237 4799 generic.go:334] "Generic (PLEG): container finished" podID="8886a9f8-8f15-43f4-a721-3a487d2ff6f7" containerID="87f0e9a060bc181634e9448b781ceb9c30505ff536ae49f4aadf61bec2aa9d56" exitCode=0 Mar 19 20:33:53 crc kubenswrapper[4799]: I0319 20:33:53.996422 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" 
event={"ID":"8886a9f8-8f15-43f4-a721-3a487d2ff6f7","Type":"ContainerDied","Data":"87f0e9a060bc181634e9448b781ceb9c30505ff536ae49f4aadf61bec2aa9d56"} Mar 19 20:33:54 crc kubenswrapper[4799]: I0319 20:33:54.061502 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mdm7f"] Mar 19 20:33:54 crc kubenswrapper[4799]: I0319 20:33:54.072998 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mdm7f"] Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.143715 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901bb6a6-df70-44d8-a3e7-b8de5d4b51d6" path="/var/lib/kubelet/pods/901bb6a6-df70-44d8-a3e7-b8de5d4b51d6/volumes" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.524892 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.682598 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-ssh-key-openstack-edpm-ipam\") pod \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.682721 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-inventory\") pod \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\" (UID: \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.682866 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rb99\" (UniqueName: \"kubernetes.io/projected/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-kube-api-access-6rb99\") pod \"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\" (UID: 
\"8886a9f8-8f15-43f4-a721-3a487d2ff6f7\") " Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.687834 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-kube-api-access-6rb99" (OuterVolumeSpecName: "kube-api-access-6rb99") pod "8886a9f8-8f15-43f4-a721-3a487d2ff6f7" (UID: "8886a9f8-8f15-43f4-a721-3a487d2ff6f7"). InnerVolumeSpecName "kube-api-access-6rb99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.709147 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8886a9f8-8f15-43f4-a721-3a487d2ff6f7" (UID: "8886a9f8-8f15-43f4-a721-3a487d2ff6f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.714324 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-inventory" (OuterVolumeSpecName: "inventory") pod "8886a9f8-8f15-43f4-a721-3a487d2ff6f7" (UID: "8886a9f8-8f15-43f4-a721-3a487d2ff6f7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.785445 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rb99\" (UniqueName: \"kubernetes.io/projected/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-kube-api-access-6rb99\") on node \"crc\" DevicePath \"\"" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.785746 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:33:55 crc kubenswrapper[4799]: I0319 20:33:55.785757 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8886a9f8-8f15-43f4-a721-3a487d2ff6f7-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.020590 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" event={"ID":"8886a9f8-8f15-43f4-a721-3a487d2ff6f7","Type":"ContainerDied","Data":"d21879ef350a418249ecb532c5477917c194d950a6b0a1b4d1bd5ab3b501cdce"} Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.020646 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d21879ef350a418249ecb532c5477917c194d950a6b0a1b4d1bd5ab3b501cdce" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.020678 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.110843 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz"] Mar 19 20:33:56 crc kubenswrapper[4799]: E0319 20:33:56.111422 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8886a9f8-8f15-43f4-a721-3a487d2ff6f7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.111444 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8886a9f8-8f15-43f4-a721-3a487d2ff6f7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 20:33:56 crc kubenswrapper[4799]: E0319 20:33:56.111463 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="extract-content" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.111469 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="extract-content" Mar 19 20:33:56 crc kubenswrapper[4799]: E0319 20:33:56.111490 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="registry-server" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.111501 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="registry-server" Mar 19 20:33:56 crc kubenswrapper[4799]: E0319 20:33:56.111516 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="extract-utilities" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.111523 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="extract-utilities" Mar 19 20:33:56 crc 
kubenswrapper[4799]: I0319 20:33:56.111722 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="8886a9f8-8f15-43f4-a721-3a487d2ff6f7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.111741 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2883bc15-124e-4f88-bbcf-305af7ef8218" containerName="registry-server" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.112358 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.115309 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.115310 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.115476 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.121135 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.130314 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz"] Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.296038 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm4bv\" (UniqueName: \"kubernetes.io/projected/4bb02004-3780-40e2-9e05-93d1791c0c16-kube-api-access-zm4bv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.297202 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.298074 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.399711 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.399872 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm4bv\" (UniqueName: \"kubernetes.io/projected/4bb02004-3780-40e2-9e05-93d1791c0c16-kube-api-access-zm4bv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: 
I0319 20:33:56.399980 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.406926 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.408635 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.422303 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm4bv\" (UniqueName: \"kubernetes.io/projected/4bb02004-3780-40e2-9e05-93d1791c0c16-kube-api-access-zm4bv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:56 crc kubenswrapper[4799]: I0319 20:33:56.437783 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.111796 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz"] Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.723679 4799 scope.go:117] "RemoveContainer" containerID="6d8d8eeb0d0e78e213af3c072a714d7de02ce7dadd70e999f11e4e93c4595d34" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.772872 4799 scope.go:117] "RemoveContainer" containerID="a9b3037206f8eef45758ab3c05a920ffce064f2e9464418e831fadf937e81d39" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.834782 4799 scope.go:117] "RemoveContainer" containerID="1ee26ff8168a50a5916e7c030c24c6f46cd670d9cd06949e93a8ccec96ddd5da" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.913800 4799 scope.go:117] "RemoveContainer" containerID="1f4655b9587148fdd509f653e27f24523ab1c2e1c8436e661eee2bdf6afa7405" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.942324 4799 scope.go:117] "RemoveContainer" containerID="29b3738d832bb9ea44f3bcbcd04611e7a98e5750bc14d9f536bae35fb20335a5" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.964631 4799 scope.go:117] "RemoveContainer" containerID="c57a7999935067c6e701e3fbce224781768447df5a40b3cd764d75de4e7f161d" Mar 19 20:33:57 crc kubenswrapper[4799]: I0319 20:33:57.985340 4799 scope.go:117] "RemoveContainer" containerID="42d8edaaeccfdd85602d0f86893e604fa36f83f851a07adf8de1cc989c54d20b" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.008360 4799 scope.go:117] "RemoveContainer" containerID="d9a73fe93c5da4a1b0141d12844c0ea46ffb7817596288a956a936e5b578eb9f" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.051017 4799 scope.go:117] "RemoveContainer" containerID="6df65e7b4857d3d91b74ceb2058f5fc0447fb8fa7e4dd4ccc77e917c2932b070" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.054554 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" event={"ID":"4bb02004-3780-40e2-9e05-93d1791c0c16","Type":"ContainerStarted","Data":"0512e46a8ccf2bb27c4ae1b591e0ef4c92c1a0285cf56df54ca9de45e52f7b8f"} Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.054599 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" event={"ID":"4bb02004-3780-40e2-9e05-93d1791c0c16","Type":"ContainerStarted","Data":"bb07428deb3cdbb9584a54d88776c3879dc5c6c35ab0d1b9fe197624241e7192"} Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.080650 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" podStartSLOduration=1.608731552 podStartE2EDuration="2.080619235s" podCreationTimestamp="2026-03-19 20:33:56 +0000 UTC" firstStartedPulling="2026-03-19 20:33:57.125728664 +0000 UTC m=+1714.731681746" lastFinishedPulling="2026-03-19 20:33:57.597616317 +0000 UTC m=+1715.203569429" observedRunningTime="2026-03-19 20:33:58.077109388 +0000 UTC m=+1715.683062470" watchObservedRunningTime="2026-03-19 20:33:58.080619235 +0000 UTC m=+1715.686572337" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.084337 4799 scope.go:117] "RemoveContainer" containerID="2e4500b604fe806cfd64ffc9c29b6f158bb4074f139e879b21f76b4553692917" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.105489 4799 scope.go:117] "RemoveContainer" containerID="b7b2b9aa30e71a3e51d52b973fb29cec97edd2922a6c017971c193609d030ad7" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.129130 4799 scope.go:117] "RemoveContainer" containerID="091c67f71c1e7ef5682647a89e6fa63babd29ed43d3973669e4568cb50bf4285" Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.756256 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:33:58 crc kubenswrapper[4799]: I0319 20:33:58.756708 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.133612 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565874-v455m"] Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.135633 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.140426 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.140448 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.141241 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.153077 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565874-v455m"] Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.279858 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hw74\" (UniqueName: \"kubernetes.io/projected/2ab17f72-6891-4733-a992-a4c84ff34b3e-kube-api-access-8hw74\") pod \"auto-csr-approver-29565874-v455m\" (UID: \"2ab17f72-6891-4733-a992-a4c84ff34b3e\") " 
pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.382046 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hw74\" (UniqueName: \"kubernetes.io/projected/2ab17f72-6891-4733-a992-a4c84ff34b3e-kube-api-access-8hw74\") pod \"auto-csr-approver-29565874-v455m\" (UID: \"2ab17f72-6891-4733-a992-a4c84ff34b3e\") " pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.414034 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hw74\" (UniqueName: \"kubernetes.io/projected/2ab17f72-6891-4733-a992-a4c84ff34b3e-kube-api-access-8hw74\") pod \"auto-csr-approver-29565874-v455m\" (UID: \"2ab17f72-6891-4733-a992-a4c84ff34b3e\") " pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.467318 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:00 crc kubenswrapper[4799]: I0319 20:34:00.975641 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565874-v455m"] Mar 19 20:34:00 crc kubenswrapper[4799]: W0319 20:34:00.983316 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab17f72_6891_4733_a992_a4c84ff34b3e.slice/crio-976986cea26241c37a69c2981b6c3d847b9c2d77469be489c755e8ee54db3170 WatchSource:0}: Error finding container 976986cea26241c37a69c2981b6c3d847b9c2d77469be489c755e8ee54db3170: Status 404 returned error can't find the container with id 976986cea26241c37a69c2981b6c3d847b9c2d77469be489c755e8ee54db3170 Mar 19 20:34:01 crc kubenswrapper[4799]: I0319 20:34:01.046132 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-s6gmj"] Mar 19 20:34:01 crc kubenswrapper[4799]: I0319 
20:34:01.060246 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-s6gmj"] Mar 19 20:34:01 crc kubenswrapper[4799]: I0319 20:34:01.090497 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565874-v455m" event={"ID":"2ab17f72-6891-4733-a992-a4c84ff34b3e","Type":"ContainerStarted","Data":"976986cea26241c37a69c2981b6c3d847b9c2d77469be489c755e8ee54db3170"} Mar 19 20:34:01 crc kubenswrapper[4799]: I0319 20:34:01.144156 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e1253c-8230-46c8-b0b7-3b34a42ae0bd" path="/var/lib/kubelet/pods/e0e1253c-8230-46c8-b0b7-3b34a42ae0bd/volumes" Mar 19 20:34:02 crc kubenswrapper[4799]: I0319 20:34:02.051955 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8c9qh"] Mar 19 20:34:02 crc kubenswrapper[4799]: I0319 20:34:02.070242 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8c9qh"] Mar 19 20:34:03 crc kubenswrapper[4799]: I0319 20:34:03.119102 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ab17f72-6891-4733-a992-a4c84ff34b3e" containerID="078f9d022d16dd421af99a06229577e775f9009630de900152c89a985383b337" exitCode=0 Mar 19 20:34:03 crc kubenswrapper[4799]: I0319 20:34:03.122742 4799 generic.go:334] "Generic (PLEG): container finished" podID="4bb02004-3780-40e2-9e05-93d1791c0c16" containerID="0512e46a8ccf2bb27c4ae1b591e0ef4c92c1a0285cf56df54ca9de45e52f7b8f" exitCode=0 Mar 19 20:34:03 crc kubenswrapper[4799]: I0319 20:34:03.136549 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b23ff57a-7365-4543-8a58-b9df4a3e52f4" path="/var/lib/kubelet/pods/b23ff57a-7365-4543-8a58-b9df4a3e52f4/volumes" Mar 19 20:34:03 crc kubenswrapper[4799]: I0319 20:34:03.137369 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565874-v455m" 
event={"ID":"2ab17f72-6891-4733-a992-a4c84ff34b3e","Type":"ContainerDied","Data":"078f9d022d16dd421af99a06229577e775f9009630de900152c89a985383b337"} Mar 19 20:34:03 crc kubenswrapper[4799]: I0319 20:34:03.137437 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" event={"ID":"4bb02004-3780-40e2-9e05-93d1791c0c16","Type":"ContainerDied","Data":"0512e46a8ccf2bb27c4ae1b591e0ef4c92c1a0285cf56df54ca9de45e52f7b8f"} Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.507887 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.607031 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.667462 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-inventory\") pod \"4bb02004-3780-40e2-9e05-93d1791c0c16\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.667516 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hw74\" (UniqueName: \"kubernetes.io/projected/2ab17f72-6891-4733-a992-a4c84ff34b3e-kube-api-access-8hw74\") pod \"2ab17f72-6891-4733-a992-a4c84ff34b3e\" (UID: \"2ab17f72-6891-4733-a992-a4c84ff34b3e\") " Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.667599 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm4bv\" (UniqueName: \"kubernetes.io/projected/4bb02004-3780-40e2-9e05-93d1791c0c16-kube-api-access-zm4bv\") pod \"4bb02004-3780-40e2-9e05-93d1791c0c16\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " Mar 19 
20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.667773 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-ssh-key-openstack-edpm-ipam\") pod \"4bb02004-3780-40e2-9e05-93d1791c0c16\" (UID: \"4bb02004-3780-40e2-9e05-93d1791c0c16\") " Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.673074 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab17f72-6891-4733-a992-a4c84ff34b3e-kube-api-access-8hw74" (OuterVolumeSpecName: "kube-api-access-8hw74") pod "2ab17f72-6891-4733-a992-a4c84ff34b3e" (UID: "2ab17f72-6891-4733-a992-a4c84ff34b3e"). InnerVolumeSpecName "kube-api-access-8hw74". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.674935 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb02004-3780-40e2-9e05-93d1791c0c16-kube-api-access-zm4bv" (OuterVolumeSpecName: "kube-api-access-zm4bv") pod "4bb02004-3780-40e2-9e05-93d1791c0c16" (UID: "4bb02004-3780-40e2-9e05-93d1791c0c16"). InnerVolumeSpecName "kube-api-access-zm4bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.700005 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4bb02004-3780-40e2-9e05-93d1791c0c16" (UID: "4bb02004-3780-40e2-9e05-93d1791c0c16"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.701929 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-inventory" (OuterVolumeSpecName: "inventory") pod "4bb02004-3780-40e2-9e05-93d1791c0c16" (UID: "4bb02004-3780-40e2-9e05-93d1791c0c16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.771129 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.771171 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hw74\" (UniqueName: \"kubernetes.io/projected/2ab17f72-6891-4733-a992-a4c84ff34b3e-kube-api-access-8hw74\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.771188 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm4bv\" (UniqueName: \"kubernetes.io/projected/4bb02004-3780-40e2-9e05-93d1791c0c16-kube-api-access-zm4bv\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:04 crc kubenswrapper[4799]: I0319 20:34:04.771202 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4bb02004-3780-40e2-9e05-93d1791c0c16-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.155515 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" event={"ID":"4bb02004-3780-40e2-9e05-93d1791c0c16","Type":"ContainerDied","Data":"bb07428deb3cdbb9584a54d88776c3879dc5c6c35ab0d1b9fe197624241e7192"} Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.155592 4799 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb07428deb3cdbb9584a54d88776c3879dc5c6c35ab0d1b9fe197624241e7192" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.155702 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.158274 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565874-v455m" event={"ID":"2ab17f72-6891-4733-a992-a4c84ff34b3e","Type":"ContainerDied","Data":"976986cea26241c37a69c2981b6c3d847b9c2d77469be489c755e8ee54db3170"} Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.158353 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565874-v455m" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.158361 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976986cea26241c37a69c2981b6c3d847b9c2d77469be489c755e8ee54db3170" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.274898 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8"] Mar 19 20:34:05 crc kubenswrapper[4799]: E0319 20:34:05.275493 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab17f72-6891-4733-a992-a4c84ff34b3e" containerName="oc" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.275521 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab17f72-6891-4733-a992-a4c84ff34b3e" containerName="oc" Mar 19 20:34:05 crc kubenswrapper[4799]: E0319 20:34:05.275544 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb02004-3780-40e2-9e05-93d1791c0c16" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.275556 4799 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4bb02004-3780-40e2-9e05-93d1791c0c16" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.275830 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb02004-3780-40e2-9e05-93d1791c0c16" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.275869 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab17f72-6891-4733-a992-a4c84ff34b3e" containerName="oc" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.276580 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.282563 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.282591 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.282833 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.283615 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.300685 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8"] Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.383822 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-ssh-key-openstack-edpm-ipam\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.383862 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsnt2\" (UniqueName: \"kubernetes.io/projected/416c30d9-5442-418f-a668-fcca8c4804a2-kube-api-access-gsnt2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.383897 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.485158 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsnt2\" (UniqueName: \"kubernetes.io/projected/416c30d9-5442-418f-a668-fcca8c4804a2-kube-api-access-gsnt2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.485231 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc 
kubenswrapper[4799]: I0319 20:34:05.485448 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.490934 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.491916 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.518775 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsnt2\" (UniqueName: \"kubernetes.io/projected/416c30d9-5442-418f-a668-fcca8c4804a2-kube-api-access-gsnt2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4wbn8\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.594269 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-bmldh"] Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.610131 4799 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-bmldh"] Mar 19 20:34:05 crc kubenswrapper[4799]: I0319 20:34:05.610608 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:06 crc kubenswrapper[4799]: W0319 20:34:06.167035 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod416c30d9_5442_418f_a668_fcca8c4804a2.slice/crio-e93c544dcf9931e2ede4c977403701cf390d057b071ce6bce3268fbb6441ee52 WatchSource:0}: Error finding container e93c544dcf9931e2ede4c977403701cf390d057b071ce6bce3268fbb6441ee52: Status 404 returned error can't find the container with id e93c544dcf9931e2ede4c977403701cf390d057b071ce6bce3268fbb6441ee52 Mar 19 20:34:06 crc kubenswrapper[4799]: I0319 20:34:06.170911 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8"] Mar 19 20:34:07 crc kubenswrapper[4799]: I0319 20:34:07.129345 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02f4e9ac-a328-489e-976e-79fc3642d88f" path="/var/lib/kubelet/pods/02f4e9ac-a328-489e-976e-79fc3642d88f/volumes" Mar 19 20:34:07 crc kubenswrapper[4799]: I0319 20:34:07.179271 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" event={"ID":"416c30d9-5442-418f-a668-fcca8c4804a2","Type":"ContainerStarted","Data":"2ff7a727c7be14bccc3b0f2ba7e93b398fc77fd4db39b9f94c96e80be6a90bbb"} Mar 19 20:34:07 crc kubenswrapper[4799]: I0319 20:34:07.179331 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" event={"ID":"416c30d9-5442-418f-a668-fcca8c4804a2","Type":"ContainerStarted","Data":"e93c544dcf9931e2ede4c977403701cf390d057b071ce6bce3268fbb6441ee52"} Mar 19 20:34:07 crc 
kubenswrapper[4799]: I0319 20:34:07.201001 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" podStartSLOduration=1.669408374 podStartE2EDuration="2.200959536s" podCreationTimestamp="2026-03-19 20:34:05 +0000 UTC" firstStartedPulling="2026-03-19 20:34:06.17053732 +0000 UTC m=+1723.776490392" lastFinishedPulling="2026-03-19 20:34:06.702088442 +0000 UTC m=+1724.308041554" observedRunningTime="2026-03-19 20:34:07.194060146 +0000 UTC m=+1724.800013238" watchObservedRunningTime="2026-03-19 20:34:07.200959536 +0000 UTC m=+1724.806912628" Mar 19 20:34:17 crc kubenswrapper[4799]: I0319 20:34:17.057693 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jnnwh"] Mar 19 20:34:17 crc kubenswrapper[4799]: I0319 20:34:17.076915 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rcxnv"] Mar 19 20:34:17 crc kubenswrapper[4799]: I0319 20:34:17.087672 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rcxnv"] Mar 19 20:34:17 crc kubenswrapper[4799]: I0319 20:34:17.098540 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jnnwh"] Mar 19 20:34:17 crc kubenswrapper[4799]: I0319 20:34:17.128735 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a34e40-cb35-4589-9fcd-20130bd7831f" path="/var/lib/kubelet/pods/51a34e40-cb35-4589-9fcd-20130bd7831f/volumes" Mar 19 20:34:17 crc kubenswrapper[4799]: I0319 20:34:17.129722 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a40c3083-66e1-4fb7-b4b4-88fa2935cb49" path="/var/lib/kubelet/pods/a40c3083-66e1-4fb7-b4b4-88fa2935cb49/volumes" Mar 19 20:34:28 crc kubenswrapper[4799]: I0319 20:34:28.755934 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:34:28 crc kubenswrapper[4799]: I0319 20:34:28.756794 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:34:42 crc kubenswrapper[4799]: I0319 20:34:42.606871 4799 generic.go:334] "Generic (PLEG): container finished" podID="416c30d9-5442-418f-a668-fcca8c4804a2" containerID="2ff7a727c7be14bccc3b0f2ba7e93b398fc77fd4db39b9f94c96e80be6a90bbb" exitCode=0 Mar 19 20:34:42 crc kubenswrapper[4799]: I0319 20:34:42.607001 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" event={"ID":"416c30d9-5442-418f-a668-fcca8c4804a2","Type":"ContainerDied","Data":"2ff7a727c7be14bccc3b0f2ba7e93b398fc77fd4db39b9f94c96e80be6a90bbb"} Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.051086 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.185661 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsnt2\" (UniqueName: \"kubernetes.io/projected/416c30d9-5442-418f-a668-fcca8c4804a2-kube-api-access-gsnt2\") pod \"416c30d9-5442-418f-a668-fcca8c4804a2\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.185717 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-inventory\") pod \"416c30d9-5442-418f-a668-fcca8c4804a2\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.185771 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-ssh-key-openstack-edpm-ipam\") pod \"416c30d9-5442-418f-a668-fcca8c4804a2\" (UID: \"416c30d9-5442-418f-a668-fcca8c4804a2\") " Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.200989 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416c30d9-5442-418f-a668-fcca8c4804a2-kube-api-access-gsnt2" (OuterVolumeSpecName: "kube-api-access-gsnt2") pod "416c30d9-5442-418f-a668-fcca8c4804a2" (UID: "416c30d9-5442-418f-a668-fcca8c4804a2"). InnerVolumeSpecName "kube-api-access-gsnt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.216829 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-inventory" (OuterVolumeSpecName: "inventory") pod "416c30d9-5442-418f-a668-fcca8c4804a2" (UID: "416c30d9-5442-418f-a668-fcca8c4804a2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.218293 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "416c30d9-5442-418f-a668-fcca8c4804a2" (UID: "416c30d9-5442-418f-a668-fcca8c4804a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.290594 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsnt2\" (UniqueName: \"kubernetes.io/projected/416c30d9-5442-418f-a668-fcca8c4804a2-kube-api-access-gsnt2\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.290923 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.290959 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/416c30d9-5442-418f-a668-fcca8c4804a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.635534 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" event={"ID":"416c30d9-5442-418f-a668-fcca8c4804a2","Type":"ContainerDied","Data":"e93c544dcf9931e2ede4c977403701cf390d057b071ce6bce3268fbb6441ee52"} Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.635607 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e93c544dcf9931e2ede4c977403701cf390d057b071ce6bce3268fbb6441ee52" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 
20:34:44.636088 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4wbn8" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.743702 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh"] Mar 19 20:34:44 crc kubenswrapper[4799]: E0319 20:34:44.744144 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416c30d9-5442-418f-a668-fcca8c4804a2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.744168 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="416c30d9-5442-418f-a668-fcca8c4804a2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.744723 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="416c30d9-5442-418f-a668-fcca8c4804a2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.745343 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.748047 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.748505 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.748660 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.751215 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.766879 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh"] Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.904960 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.905045 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:44 crc kubenswrapper[4799]: I0319 20:34:44.905286 
4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mscc8\" (UniqueName: \"kubernetes.io/projected/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-kube-api-access-mscc8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.007915 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mscc8\" (UniqueName: \"kubernetes.io/projected/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-kube-api-access-mscc8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.008128 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.008219 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.014171 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.018912 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.041638 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mscc8\" (UniqueName: \"kubernetes.io/projected/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-kube-api-access-mscc8\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.083801 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:34:45 crc kubenswrapper[4799]: I0319 20:34:45.686765 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh"] Mar 19 20:34:45 crc kubenswrapper[4799]: W0319 20:34:45.693124 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode06b9a2d_57e1_4c4d_a08e_d0c2c343130d.slice/crio-d0dd710681591e9127c43e0c0680d14d86514d943e55e64dcdfbf8820003ea73 WatchSource:0}: Error finding container d0dd710681591e9127c43e0c0680d14d86514d943e55e64dcdfbf8820003ea73: Status 404 returned error can't find the container with id d0dd710681591e9127c43e0c0680d14d86514d943e55e64dcdfbf8820003ea73 Mar 19 20:34:46 crc kubenswrapper[4799]: I0319 20:34:46.655895 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" event={"ID":"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d","Type":"ContainerStarted","Data":"b3c81d0737a1566a94d5788435fc02a8e8cdaf99ee37a905a7e98a757b746637"} Mar 19 20:34:46 crc kubenswrapper[4799]: I0319 20:34:46.656466 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" event={"ID":"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d","Type":"ContainerStarted","Data":"d0dd710681591e9127c43e0c0680d14d86514d943e55e64dcdfbf8820003ea73"} Mar 19 20:34:46 crc kubenswrapper[4799]: I0319 20:34:46.673149 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" podStartSLOduration=2.117307388 podStartE2EDuration="2.673123364s" podCreationTimestamp="2026-03-19 20:34:44 +0000 UTC" firstStartedPulling="2026-03-19 20:34:45.695467397 +0000 UTC m=+1763.301420479" lastFinishedPulling="2026-03-19 20:34:46.251283373 +0000 UTC m=+1763.857236455" 
observedRunningTime="2026-03-19 20:34:46.67116456 +0000 UTC m=+1764.277117652" watchObservedRunningTime="2026-03-19 20:34:46.673123364 +0000 UTC m=+1764.279076446" Mar 19 20:34:54 crc kubenswrapper[4799]: I0319 20:34:54.047527 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2msv6"] Mar 19 20:34:54 crc kubenswrapper[4799]: I0319 20:34:54.059274 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2msv6"] Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.045106 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lccb6"] Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.062849 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e7c2-account-create-update-szzhc"] Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.080279 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lccb6"] Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.090830 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e7c2-account-create-update-szzhc"] Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.127751 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a750507b-f4c5-4327-ad80-5b20b8740bef" path="/var/lib/kubelet/pods/a750507b-f4c5-4327-ad80-5b20b8740bef/volumes" Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.128875 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb48c3f-ff08-41c9-b3b9-6d974dc85797" path="/var/lib/kubelet/pods/bcb48c3f-ff08-41c9-b3b9-6d974dc85797/volumes" Mar 19 20:34:55 crc kubenswrapper[4799]: I0319 20:34:55.129701 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c920492a-2fc5-4531-9eff-538a52f5d3de" path="/var/lib/kubelet/pods/c920492a-2fc5-4531-9eff-538a52f5d3de/volumes" Mar 19 20:34:56 crc kubenswrapper[4799]: I0319 20:34:56.047680 4799 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8skh8"] Mar 19 20:34:56 crc kubenswrapper[4799]: I0319 20:34:56.068717 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e217-account-create-update-ww4cp"] Mar 19 20:34:56 crc kubenswrapper[4799]: I0319 20:34:56.079906 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-de1e-account-create-update-t6tzl"] Mar 19 20:34:56 crc kubenswrapper[4799]: I0319 20:34:56.088469 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e217-account-create-update-ww4cp"] Mar 19 20:34:56 crc kubenswrapper[4799]: I0319 20:34:56.096510 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8skh8"] Mar 19 20:34:56 crc kubenswrapper[4799]: I0319 20:34:56.104649 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-de1e-account-create-update-t6tzl"] Mar 19 20:34:57 crc kubenswrapper[4799]: I0319 20:34:57.131291 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f679b2-054e-4fd3-9b93-276ffd1a45b6" path="/var/lib/kubelet/pods/19f679b2-054e-4fd3-9b93-276ffd1a45b6/volumes" Mar 19 20:34:57 crc kubenswrapper[4799]: I0319 20:34:57.133254 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630a3885-5144-4f12-9488-b51346e29dee" path="/var/lib/kubelet/pods/630a3885-5144-4f12-9488-b51346e29dee/volumes" Mar 19 20:34:57 crc kubenswrapper[4799]: I0319 20:34:57.134646 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819444ce-2f1f-4970-b22d-80cd3bf90a1d" path="/var/lib/kubelet/pods/819444ce-2f1f-4970-b22d-80cd3bf90a1d/volumes" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.318155 4799 scope.go:117] "RemoveContainer" containerID="f7a314e611b6d56bace35ef21892d25f624e26145034811bfe59eeea49762ccf" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.338237 4799 scope.go:117] "RemoveContainer" 
containerID="1445ff62688485c809fec8edc4c9dd52aee914ac649fe3ffccb276995babf78b" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.397922 4799 scope.go:117] "RemoveContainer" containerID="2403c6a66e3746f55049f0e96b136599fa3c88bdaf68f346b55229bc68180da3" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.445830 4799 scope.go:117] "RemoveContainer" containerID="da2eda38a0f2530838cfdd31a9c72be9d2a8499a055ed7272cb62f0aad9943a3" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.494494 4799 scope.go:117] "RemoveContainer" containerID="2b28f11163b433e88869f33aef128502e7a4e9631d4b643afd00365ecad9b182" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.554978 4799 scope.go:117] "RemoveContainer" containerID="d0d006f7cd56602aa308c1ec73c132a5490886e9b3ab340eccc945dbafcbe5ba" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.595103 4799 scope.go:117] "RemoveContainer" containerID="b8f230c7dc9e803ce7e492826f25e0bb103a0b6ffcfc36c14c3ad57a4b6ee6b9" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.640105 4799 scope.go:117] "RemoveContainer" containerID="95bb59016b433dab2f7105e6e7671f83f86e600bec3c153f73944b9e9e6878f3" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.687438 4799 scope.go:117] "RemoveContainer" containerID="26d980a05014aaec753f74f5670f922f956324227822f53c0a2b4d0d108325d8" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.706338 4799 scope.go:117] "RemoveContainer" containerID="271bb4327b0578e5b0ddc914d0d2bc1565e6e54463224880ac06d414ef06cb9a" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.746792 4799 scope.go:117] "RemoveContainer" containerID="9eb2958c467d6f996313ad694f7fd17a3da3991d9bc2c30eeeb5e8bd600e1cb0" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.756022 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.756070 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.756135 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.757141 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:34:58 crc kubenswrapper[4799]: I0319 20:34:58.757204 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" gracePeriod=600 Mar 19 20:34:58 crc kubenswrapper[4799]: E0319 20:34:58.880603 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:34:59 crc 
kubenswrapper[4799]: I0319 20:34:59.841820 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" exitCode=0 Mar 19 20:34:59 crc kubenswrapper[4799]: I0319 20:34:59.841869 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac"} Mar 19 20:34:59 crc kubenswrapper[4799]: I0319 20:34:59.842226 4799 scope.go:117] "RemoveContainer" containerID="09af92723404390c6609268ff888c50debb267c3f634bab822b68cda538b8c1c" Mar 19 20:34:59 crc kubenswrapper[4799]: I0319 20:34:59.843142 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:34:59 crc kubenswrapper[4799]: E0319 20:34:59.843526 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:35:15 crc kubenswrapper[4799]: I0319 20:35:15.116372 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:35:15 crc kubenswrapper[4799]: E0319 20:35:15.117479 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:35:24 crc kubenswrapper[4799]: I0319 20:35:24.054427 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bkvvs"] Mar 19 20:35:24 crc kubenswrapper[4799]: I0319 20:35:24.064105 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bkvvs"] Mar 19 20:35:25 crc kubenswrapper[4799]: I0319 20:35:25.137309 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9a9bf0-25f2-4716-bb99-8374f07ec1ff" path="/var/lib/kubelet/pods/1f9a9bf0-25f2-4716-bb99-8374f07ec1ff/volumes" Mar 19 20:35:27 crc kubenswrapper[4799]: I0319 20:35:27.116298 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:35:27 crc kubenswrapper[4799]: E0319 20:35:27.117268 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:35:35 crc kubenswrapper[4799]: I0319 20:35:35.232751 4799 generic.go:334] "Generic (PLEG): container finished" podID="e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" containerID="b3c81d0737a1566a94d5788435fc02a8e8cdaf99ee37a905a7e98a757b746637" exitCode=0 Mar 19 20:35:35 crc kubenswrapper[4799]: I0319 20:35:35.232872 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" event={"ID":"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d","Type":"ContainerDied","Data":"b3c81d0737a1566a94d5788435fc02a8e8cdaf99ee37a905a7e98a757b746637"} Mar 19 20:35:36 crc 
kubenswrapper[4799]: I0319 20:35:36.706959 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.798786 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-inventory\") pod \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.798931 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-ssh-key-openstack-edpm-ipam\") pod \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.799084 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mscc8\" (UniqueName: \"kubernetes.io/projected/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-kube-api-access-mscc8\") pod \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\" (UID: \"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d\") " Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.807804 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-kube-api-access-mscc8" (OuterVolumeSpecName: "kube-api-access-mscc8") pod "e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" (UID: "e06b9a2d-57e1-4c4d-a08e-d0c2c343130d"). InnerVolumeSpecName "kube-api-access-mscc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.837002 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" (UID: "e06b9a2d-57e1-4c4d-a08e-d0c2c343130d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.841503 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-inventory" (OuterVolumeSpecName: "inventory") pod "e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" (UID: "e06b9a2d-57e1-4c4d-a08e-d0c2c343130d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.901622 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.902044 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:35:36 crc kubenswrapper[4799]: I0319 20:35:36.902065 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mscc8\" (UniqueName: \"kubernetes.io/projected/e06b9a2d-57e1-4c4d-a08e-d0c2c343130d-kube-api-access-mscc8\") on node \"crc\" DevicePath \"\"" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.308173 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" 
event={"ID":"e06b9a2d-57e1-4c4d-a08e-d0c2c343130d","Type":"ContainerDied","Data":"d0dd710681591e9127c43e0c0680d14d86514d943e55e64dcdfbf8820003ea73"} Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.308261 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0dd710681591e9127c43e0c0680d14d86514d943e55e64dcdfbf8820003ea73" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.308291 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.406158 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2jrjr"] Mar 19 20:35:37 crc kubenswrapper[4799]: E0319 20:35:37.406808 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.406838 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.407183 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06b9a2d-57e1-4c4d-a08e-d0c2c343130d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.408499 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.411214 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.411371 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.411812 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.413058 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.419626 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2jrjr"] Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.517524 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxvd\" (UniqueName: \"kubernetes.io/projected/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-kube-api-access-7wxvd\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.517613 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.517800 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.619887 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.620123 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.620245 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxvd\" (UniqueName: \"kubernetes.io/projected/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-kube-api-access-7wxvd\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.628655 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.638720 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.643289 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxvd\" (UniqueName: \"kubernetes.io/projected/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-kube-api-access-7wxvd\") pod \"ssh-known-hosts-edpm-deployment-2jrjr\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:37 crc kubenswrapper[4799]: I0319 20:35:37.730883 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:38 crc kubenswrapper[4799]: I0319 20:35:38.134615 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2jrjr"] Mar 19 20:35:38 crc kubenswrapper[4799]: I0319 20:35:38.137920 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:35:38 crc kubenswrapper[4799]: I0319 20:35:38.321506 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" event={"ID":"c2e0b37a-955c-4332-bae2-6a7ffd2712f4","Type":"ContainerStarted","Data":"f7ed004ad0f5743f8903c03fe60ff92dfdddf2905f9f0670698b87f5cf6ca9de"} Mar 19 20:35:39 crc kubenswrapper[4799]: I0319 20:35:39.332466 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" event={"ID":"c2e0b37a-955c-4332-bae2-6a7ffd2712f4","Type":"ContainerStarted","Data":"2de6620a4f7b275a06fda6d19eaf814ffc4a1268165d341bba7c933eb74288c6"} Mar 19 20:35:39 crc kubenswrapper[4799]: I0319 
20:35:39.358926 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" podStartSLOduration=1.927873694 podStartE2EDuration="2.35890431s" podCreationTimestamp="2026-03-19 20:35:37 +0000 UTC" firstStartedPulling="2026-03-19 20:35:38.1373408 +0000 UTC m=+1815.743293902" lastFinishedPulling="2026-03-19 20:35:38.568371436 +0000 UTC m=+1816.174324518" observedRunningTime="2026-03-19 20:35:39.354484858 +0000 UTC m=+1816.960437950" watchObservedRunningTime="2026-03-19 20:35:39.35890431 +0000 UTC m=+1816.964857392" Mar 19 20:35:42 crc kubenswrapper[4799]: I0319 20:35:42.116467 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:35:42 crc kubenswrapper[4799]: E0319 20:35:42.118866 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:35:45 crc kubenswrapper[4799]: I0319 20:35:45.048500 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-59gtz"] Mar 19 20:35:45 crc kubenswrapper[4799]: I0319 20:35:45.056768 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-59gtz"] Mar 19 20:35:45 crc kubenswrapper[4799]: I0319 20:35:45.136145 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1429cf-e6a9-4c99-af55-4ab7d80cd651" path="/var/lib/kubelet/pods/0c1429cf-e6a9-4c99-af55-4ab7d80cd651/volumes" Mar 19 20:35:46 crc kubenswrapper[4799]: I0319 20:35:46.038815 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-ftd7g"] Mar 19 20:35:46 crc kubenswrapper[4799]: I0319 20:35:46.050437 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ftd7g"] Mar 19 20:35:46 crc kubenswrapper[4799]: I0319 20:35:46.409587 4799 generic.go:334] "Generic (PLEG): container finished" podID="c2e0b37a-955c-4332-bae2-6a7ffd2712f4" containerID="2de6620a4f7b275a06fda6d19eaf814ffc4a1268165d341bba7c933eb74288c6" exitCode=0 Mar 19 20:35:46 crc kubenswrapper[4799]: I0319 20:35:46.409662 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" event={"ID":"c2e0b37a-955c-4332-bae2-6a7ffd2712f4","Type":"ContainerDied","Data":"2de6620a4f7b275a06fda6d19eaf814ffc4a1268165d341bba7c933eb74288c6"} Mar 19 20:35:47 crc kubenswrapper[4799]: I0319 20:35:47.138067 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7edd0f4d-c473-43d0-acad-40ba65e1ae5b" path="/var/lib/kubelet/pods/7edd0f4d-c473-43d0-acad-40ba65e1ae5b/volumes" Mar 19 20:35:47 crc kubenswrapper[4799]: I0319 20:35:47.907656 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.058157 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-inventory-0\") pod \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.058316 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxvd\" (UniqueName: \"kubernetes.io/projected/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-kube-api-access-7wxvd\") pod \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.058506 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-ssh-key-openstack-edpm-ipam\") pod \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\" (UID: \"c2e0b37a-955c-4332-bae2-6a7ffd2712f4\") " Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.067611 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-kube-api-access-7wxvd" (OuterVolumeSpecName: "kube-api-access-7wxvd") pod "c2e0b37a-955c-4332-bae2-6a7ffd2712f4" (UID: "c2e0b37a-955c-4332-bae2-6a7ffd2712f4"). InnerVolumeSpecName "kube-api-access-7wxvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.107462 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c2e0b37a-955c-4332-bae2-6a7ffd2712f4" (UID: "c2e0b37a-955c-4332-bae2-6a7ffd2712f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.116414 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c2e0b37a-955c-4332-bae2-6a7ffd2712f4" (UID: "c2e0b37a-955c-4332-bae2-6a7ffd2712f4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.161046 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxvd\" (UniqueName: \"kubernetes.io/projected/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-kube-api-access-7wxvd\") on node \"crc\" DevicePath \"\"" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.161080 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.161090 4799 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c2e0b37a-955c-4332-bae2-6a7ffd2712f4-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.437162 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" 
event={"ID":"c2e0b37a-955c-4332-bae2-6a7ffd2712f4","Type":"ContainerDied","Data":"f7ed004ad0f5743f8903c03fe60ff92dfdddf2905f9f0670698b87f5cf6ca9de"} Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.437614 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7ed004ad0f5743f8903c03fe60ff92dfdddf2905f9f0670698b87f5cf6ca9de" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.437628 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2jrjr" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.535473 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6"] Mar 19 20:35:48 crc kubenswrapper[4799]: E0319 20:35:48.536497 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e0b37a-955c-4332-bae2-6a7ffd2712f4" containerName="ssh-known-hosts-edpm-deployment" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.536658 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e0b37a-955c-4332-bae2-6a7ffd2712f4" containerName="ssh-known-hosts-edpm-deployment" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.537191 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e0b37a-955c-4332-bae2-6a7ffd2712f4" containerName="ssh-known-hosts-edpm-deployment" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.538532 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.541600 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.541681 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.541806 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.542222 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.559657 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6"] Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.674209 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.674326 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.674366 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfxtg\" (UniqueName: \"kubernetes.io/projected/2c78b6c2-a079-427d-9ebe-f8250777e6bd-kube-api-access-xfxtg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.779187 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.779249 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.779273 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfxtg\" (UniqueName: \"kubernetes.io/projected/2c78b6c2-a079-427d-9ebe-f8250777e6bd-kube-api-access-xfxtg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.783487 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.783648 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.816486 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfxtg\" (UniqueName: \"kubernetes.io/projected/2c78b6c2-a079-427d-9ebe-f8250777e6bd-kube-api-access-xfxtg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cxss6\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:48 crc kubenswrapper[4799]: I0319 20:35:48.868316 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:35:49 crc kubenswrapper[4799]: I0319 20:35:49.471155 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6"] Mar 19 20:35:50 crc kubenswrapper[4799]: I0319 20:35:50.457323 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" event={"ID":"2c78b6c2-a079-427d-9ebe-f8250777e6bd","Type":"ContainerStarted","Data":"c6edd149de6af7249673dcc19d76ec8e5e79681d06f44f444bd2cf13a019bfd4"} Mar 19 20:35:50 crc kubenswrapper[4799]: I0319 20:35:50.457369 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" event={"ID":"2c78b6c2-a079-427d-9ebe-f8250777e6bd","Type":"ContainerStarted","Data":"08fe2195ba03c7e10229f3d6df193dceb6f6697ed25a8c58e81c6cec78de4b25"} Mar 19 20:35:50 crc kubenswrapper[4799]: I0319 20:35:50.483919 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" podStartSLOduration=1.9444743020000002 podStartE2EDuration="2.483897944s" podCreationTimestamp="2026-03-19 20:35:48 +0000 UTC" firstStartedPulling="2026-03-19 20:35:49.482292425 +0000 UTC m=+1827.088245497" lastFinishedPulling="2026-03-19 20:35:50.021716057 +0000 UTC m=+1827.627669139" observedRunningTime="2026-03-19 20:35:50.475264835 +0000 UTC m=+1828.081217927" watchObservedRunningTime="2026-03-19 20:35:50.483897944 +0000 UTC m=+1828.089851026" Mar 19 20:35:53 crc kubenswrapper[4799]: I0319 20:35:53.127766 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:35:53 crc kubenswrapper[4799]: E0319 20:35:53.128812 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:35:58 crc kubenswrapper[4799]: I0319 20:35:58.555953 4799 generic.go:334] "Generic (PLEG): container finished" podID="2c78b6c2-a079-427d-9ebe-f8250777e6bd" containerID="c6edd149de6af7249673dcc19d76ec8e5e79681d06f44f444bd2cf13a019bfd4" exitCode=0 Mar 19 20:35:58 crc kubenswrapper[4799]: I0319 20:35:58.556010 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" event={"ID":"2c78b6c2-a079-427d-9ebe-f8250777e6bd","Type":"ContainerDied","Data":"c6edd149de6af7249673dcc19d76ec8e5e79681d06f44f444bd2cf13a019bfd4"} Mar 19 20:35:58 crc kubenswrapper[4799]: I0319 20:35:58.974038 4799 scope.go:117] "RemoveContainer" containerID="f651de3286bdd3b79e81fe20e4d30b50031f8bcad9801c805b6e7ade9fbe3c1c" Mar 19 20:35:59 crc kubenswrapper[4799]: I0319 20:35:59.048733 4799 scope.go:117] "RemoveContainer" containerID="a893e648474e84340cee07704a112602321168fea4d163230837a5e2da3bf0ef" Mar 19 20:35:59 crc kubenswrapper[4799]: I0319 20:35:59.103143 4799 scope.go:117] "RemoveContainer" containerID="6a4e2db20e948f9df36747bb9ee78d065c8a7e573258544955ca98c780dbd65a" Mar 19 20:35:59 crc kubenswrapper[4799]: I0319 20:35:59.975800 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.029162 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfxtg\" (UniqueName: \"kubernetes.io/projected/2c78b6c2-a079-427d-9ebe-f8250777e6bd-kube-api-access-xfxtg\") pod \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.029588 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-ssh-key-openstack-edpm-ipam\") pod \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.029815 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-inventory\") pod \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\" (UID: \"2c78b6c2-a079-427d-9ebe-f8250777e6bd\") " Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.037213 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c78b6c2-a079-427d-9ebe-f8250777e6bd-kube-api-access-xfxtg" (OuterVolumeSpecName: "kube-api-access-xfxtg") pod "2c78b6c2-a079-427d-9ebe-f8250777e6bd" (UID: "2c78b6c2-a079-427d-9ebe-f8250777e6bd"). InnerVolumeSpecName "kube-api-access-xfxtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.058850 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-inventory" (OuterVolumeSpecName: "inventory") pod "2c78b6c2-a079-427d-9ebe-f8250777e6bd" (UID: "2c78b6c2-a079-427d-9ebe-f8250777e6bd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.064587 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c78b6c2-a079-427d-9ebe-f8250777e6bd" (UID: "2c78b6c2-a079-427d-9ebe-f8250777e6bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.134963 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.135543 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c78b6c2-a079-427d-9ebe-f8250777e6bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.135559 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfxtg\" (UniqueName: \"kubernetes.io/projected/2c78b6c2-a079-427d-9ebe-f8250777e6bd-kube-api-access-xfxtg\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.137498 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565876-kcmfl"] Mar 19 20:36:00 crc kubenswrapper[4799]: E0319 20:36:00.137986 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c78b6c2-a079-427d-9ebe-f8250777e6bd" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.138066 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c78b6c2-a079-427d-9ebe-f8250777e6bd" 
containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.138315 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c78b6c2-a079-427d-9ebe-f8250777e6bd" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.139056 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.141920 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.142132 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.143653 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.149218 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565876-kcmfl"] Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.237487 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2h6\" (UniqueName: \"kubernetes.io/projected/32d48006-ae79-434d-80fe-02d1d3884e33-kube-api-access-8g2h6\") pod \"auto-csr-approver-29565876-kcmfl\" (UID: \"32d48006-ae79-434d-80fe-02d1d3884e33\") " pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.338770 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2h6\" (UniqueName: \"kubernetes.io/projected/32d48006-ae79-434d-80fe-02d1d3884e33-kube-api-access-8g2h6\") pod \"auto-csr-approver-29565876-kcmfl\" (UID: \"32d48006-ae79-434d-80fe-02d1d3884e33\") " 
pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.360567 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2h6\" (UniqueName: \"kubernetes.io/projected/32d48006-ae79-434d-80fe-02d1d3884e33-kube-api-access-8g2h6\") pod \"auto-csr-approver-29565876-kcmfl\" (UID: \"32d48006-ae79-434d-80fe-02d1d3884e33\") " pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.458511 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.580226 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" event={"ID":"2c78b6c2-a079-427d-9ebe-f8250777e6bd","Type":"ContainerDied","Data":"08fe2195ba03c7e10229f3d6df193dceb6f6697ed25a8c58e81c6cec78de4b25"} Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.580275 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08fe2195ba03c7e10229f3d6df193dceb6f6697ed25a8c58e81c6cec78de4b25" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.580305 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cxss6" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.686057 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh"] Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.687745 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.690035 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.690265 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.690660 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.690991 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.709272 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh"] Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.747840 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.748107 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv86h\" (UniqueName: \"kubernetes.io/projected/d20cbc69-15fd-45a3-95f9-d29078eb55c7-kube-api-access-bv86h\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 
20:36:00.748318 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.850727 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.850846 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv86h\" (UniqueName: \"kubernetes.io/projected/d20cbc69-15fd-45a3-95f9-d29078eb55c7-kube-api-access-bv86h\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.850955 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.855020 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.855514 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:00 crc kubenswrapper[4799]: I0319 20:36:00.869801 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv86h\" (UniqueName: \"kubernetes.io/projected/d20cbc69-15fd-45a3-95f9-d29078eb55c7-kube-api-access-bv86h\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:01 crc kubenswrapper[4799]: I0319 20:36:01.005299 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:01 crc kubenswrapper[4799]: I0319 20:36:01.048301 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565876-kcmfl"] Mar 19 20:36:01 crc kubenswrapper[4799]: I0319 20:36:01.598137 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" event={"ID":"32d48006-ae79-434d-80fe-02d1d3884e33","Type":"ContainerStarted","Data":"e793264f050df79af25e051b25bd0b5b14f29d706e7fa794e17b0ffb8a27a38d"} Mar 19 20:36:01 crc kubenswrapper[4799]: I0319 20:36:01.653004 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh"] Mar 19 20:36:02 crc kubenswrapper[4799]: I0319 20:36:02.615736 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" event={"ID":"d20cbc69-15fd-45a3-95f9-d29078eb55c7","Type":"ContainerStarted","Data":"3d0aac5c43fcde271cdcfee7d5dc0d7dae92ff1514fd8c2f1ceb9f551b36c483"} Mar 19 20:36:02 crc kubenswrapper[4799]: I0319 20:36:02.616374 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" event={"ID":"d20cbc69-15fd-45a3-95f9-d29078eb55c7","Type":"ContainerStarted","Data":"2cd8d1c32d3e59634908fa3c7d8d24439c48676801bc0dba3156e562d582b09c"} Mar 19 20:36:02 crc kubenswrapper[4799]: I0319 20:36:02.653962 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" podStartSLOduration=2.141356577 podStartE2EDuration="2.653934537s" podCreationTimestamp="2026-03-19 20:36:00 +0000 UTC" firstStartedPulling="2026-03-19 20:36:01.676142577 +0000 UTC m=+1839.282095649" lastFinishedPulling="2026-03-19 20:36:02.188720537 +0000 UTC m=+1839.794673609" observedRunningTime="2026-03-19 20:36:02.636265478 +0000 UTC 
m=+1840.242218550" watchObservedRunningTime="2026-03-19 20:36:02.653934537 +0000 UTC m=+1840.259887629" Mar 19 20:36:03 crc kubenswrapper[4799]: I0319 20:36:03.642262 4799 generic.go:334] "Generic (PLEG): container finished" podID="32d48006-ae79-434d-80fe-02d1d3884e33" containerID="3b9c41610c105beb23934c157217ab50fea825905fc7c8485dd4ec3cb1b7dace" exitCode=0 Mar 19 20:36:03 crc kubenswrapper[4799]: I0319 20:36:03.642531 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" event={"ID":"32d48006-ae79-434d-80fe-02d1d3884e33","Type":"ContainerDied","Data":"3b9c41610c105beb23934c157217ab50fea825905fc7c8485dd4ec3cb1b7dace"} Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.149279 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.246073 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g2h6\" (UniqueName: \"kubernetes.io/projected/32d48006-ae79-434d-80fe-02d1d3884e33-kube-api-access-8g2h6\") pod \"32d48006-ae79-434d-80fe-02d1d3884e33\" (UID: \"32d48006-ae79-434d-80fe-02d1d3884e33\") " Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.251968 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d48006-ae79-434d-80fe-02d1d3884e33-kube-api-access-8g2h6" (OuterVolumeSpecName: "kube-api-access-8g2h6") pod "32d48006-ae79-434d-80fe-02d1d3884e33" (UID: "32d48006-ae79-434d-80fe-02d1d3884e33"). InnerVolumeSpecName "kube-api-access-8g2h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.348045 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g2h6\" (UniqueName: \"kubernetes.io/projected/32d48006-ae79-434d-80fe-02d1d3884e33-kube-api-access-8g2h6\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.666929 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.666897 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565876-kcmfl" event={"ID":"32d48006-ae79-434d-80fe-02d1d3884e33","Type":"ContainerDied","Data":"e793264f050df79af25e051b25bd0b5b14f29d706e7fa794e17b0ffb8a27a38d"} Mar 19 20:36:05 crc kubenswrapper[4799]: I0319 20:36:05.667231 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e793264f050df79af25e051b25bd0b5b14f29d706e7fa794e17b0ffb8a27a38d" Mar 19 20:36:06 crc kubenswrapper[4799]: I0319 20:36:06.116552 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:36:06 crc kubenswrapper[4799]: E0319 20:36:06.117284 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:36:06 crc kubenswrapper[4799]: I0319 20:36:06.237018 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-p96cm"] Mar 19 20:36:06 crc kubenswrapper[4799]: I0319 20:36:06.250002 4799 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-p96cm"] Mar 19 20:36:07 crc kubenswrapper[4799]: I0319 20:36:07.136352 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3f2053-f94c-40f3-876d-0e3b8ca33c77" path="/var/lib/kubelet/pods/2b3f2053-f94c-40f3-876d-0e3b8ca33c77/volumes" Mar 19 20:36:11 crc kubenswrapper[4799]: I0319 20:36:11.755101 4799 generic.go:334] "Generic (PLEG): container finished" podID="d20cbc69-15fd-45a3-95f9-d29078eb55c7" containerID="3d0aac5c43fcde271cdcfee7d5dc0d7dae92ff1514fd8c2f1ceb9f551b36c483" exitCode=0 Mar 19 20:36:11 crc kubenswrapper[4799]: I0319 20:36:11.755648 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" event={"ID":"d20cbc69-15fd-45a3-95f9-d29078eb55c7","Type":"ContainerDied","Data":"3d0aac5c43fcde271cdcfee7d5dc0d7dae92ff1514fd8c2f1ceb9f551b36c483"} Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.243119 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.370778 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv86h\" (UniqueName: \"kubernetes.io/projected/d20cbc69-15fd-45a3-95f9-d29078eb55c7-kube-api-access-bv86h\") pod \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.370891 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-ssh-key-openstack-edpm-ipam\") pod \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.370977 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-inventory\") pod \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\" (UID: \"d20cbc69-15fd-45a3-95f9-d29078eb55c7\") " Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.377233 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20cbc69-15fd-45a3-95f9-d29078eb55c7-kube-api-access-bv86h" (OuterVolumeSpecName: "kube-api-access-bv86h") pod "d20cbc69-15fd-45a3-95f9-d29078eb55c7" (UID: "d20cbc69-15fd-45a3-95f9-d29078eb55c7"). InnerVolumeSpecName "kube-api-access-bv86h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.406427 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-inventory" (OuterVolumeSpecName: "inventory") pod "d20cbc69-15fd-45a3-95f9-d29078eb55c7" (UID: "d20cbc69-15fd-45a3-95f9-d29078eb55c7"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.424662 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d20cbc69-15fd-45a3-95f9-d29078eb55c7" (UID: "d20cbc69-15fd-45a3-95f9-d29078eb55c7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.474283 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv86h\" (UniqueName: \"kubernetes.io/projected/d20cbc69-15fd-45a3-95f9-d29078eb55c7-kube-api-access-bv86h\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.474326 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.474339 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20cbc69-15fd-45a3-95f9-d29078eb55c7-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.790735 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" event={"ID":"d20cbc69-15fd-45a3-95f9-d29078eb55c7","Type":"ContainerDied","Data":"2cd8d1c32d3e59634908fa3c7d8d24439c48676801bc0dba3156e562d582b09c"} Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.791150 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd8d1c32d3e59634908fa3c7d8d24439c48676801bc0dba3156e562d582b09c" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 
20:36:13.790778 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.906110 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852"] Mar 19 20:36:13 crc kubenswrapper[4799]: E0319 20:36:13.906469 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d48006-ae79-434d-80fe-02d1d3884e33" containerName="oc" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.906484 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d48006-ae79-434d-80fe-02d1d3884e33" containerName="oc" Mar 19 20:36:13 crc kubenswrapper[4799]: E0319 20:36:13.906524 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20cbc69-15fd-45a3-95f9-d29078eb55c7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.906534 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20cbc69-15fd-45a3-95f9-d29078eb55c7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.906692 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d48006-ae79-434d-80fe-02d1d3884e33" containerName="oc" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.906705 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20cbc69-15fd-45a3-95f9-d29078eb55c7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.907249 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.912604 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.912711 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.912789 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.913047 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.912979 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.913051 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.913117 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.916719 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:36:13 crc kubenswrapper[4799]: I0319 20:36:13.943344 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852"] Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086270 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9m9\" (UniqueName: 
\"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-kube-api-access-jh9m9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086318 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086353 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086471 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086501 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086549 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086576 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086601 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.086627 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.087123 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.087215 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.087332 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.087437 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.087493 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189190 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189258 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189291 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189324 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189358 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189378 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189440 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189462 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189495 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9m9\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-kube-api-access-jh9m9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189517 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189544 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189580 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189605 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.189666 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.194071 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.195293 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.195944 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.196743 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.196778 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 
19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.197671 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.198320 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.199042 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.199187 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.199586 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.199698 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.198301 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.209855 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9m9\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-kube-api-access-jh9m9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.215205 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nc852\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.228184 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:14 crc kubenswrapper[4799]: I0319 20:36:14.815154 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852"] Mar 19 20:36:15 crc kubenswrapper[4799]: I0319 20:36:15.828831 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" event={"ID":"9e20098d-7fb1-4aba-b460-1440751e6dc2","Type":"ContainerStarted","Data":"50ba4fc03be2801b40ab60a56296ae801f2ee8e8583c60679dfb80791173ae6a"} Mar 19 20:36:15 crc kubenswrapper[4799]: I0319 20:36:15.829107 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" event={"ID":"9e20098d-7fb1-4aba-b460-1440751e6dc2","Type":"ContainerStarted","Data":"8de6941c6c134611d89b3ca0220b87ab18667511a1782ad76e9273921b2e0aaa"} Mar 19 20:36:15 crc kubenswrapper[4799]: I0319 20:36:15.855190 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" podStartSLOduration=2.141412476 podStartE2EDuration="2.855168667s" podCreationTimestamp="2026-03-19 20:36:13 +0000 UTC" firstStartedPulling="2026-03-19 20:36:14.825171653 +0000 UTC m=+1852.431124725" lastFinishedPulling="2026-03-19 20:36:15.538927844 +0000 UTC m=+1853.144880916" observedRunningTime="2026-03-19 20:36:15.845442898 +0000 UTC m=+1853.451395970" watchObservedRunningTime="2026-03-19 20:36:15.855168667 +0000 UTC m=+1853.461121749" Mar 19 20:36:17 crc kubenswrapper[4799]: I0319 20:36:17.116969 4799 
scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:36:17 crc kubenswrapper[4799]: E0319 20:36:17.117451 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.382244 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hhppw"] Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.384574 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.418746 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhppw"] Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.489065 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct556\" (UniqueName: \"kubernetes.io/projected/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-kube-api-access-ct556\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.489117 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-catalog-content\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 
crc kubenswrapper[4799]: I0319 20:36:18.489254 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-utilities\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.591055 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-utilities\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.591259 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct556\" (UniqueName: \"kubernetes.io/projected/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-kube-api-access-ct556\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.591298 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-catalog-content\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.591748 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-utilities\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: 
I0319 20:36:18.591779 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-catalog-content\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.621221 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct556\" (UniqueName: \"kubernetes.io/projected/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-kube-api-access-ct556\") pod \"community-operators-hhppw\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:18 crc kubenswrapper[4799]: I0319 20:36:18.708761 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:19 crc kubenswrapper[4799]: I0319 20:36:19.176484 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhppw"] Mar 19 20:36:19 crc kubenswrapper[4799]: I0319 20:36:19.912776 4799 generic.go:334] "Generic (PLEG): container finished" podID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerID="cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798" exitCode=0 Mar 19 20:36:19 crc kubenswrapper[4799]: I0319 20:36:19.913191 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhppw" event={"ID":"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65","Type":"ContainerDied","Data":"cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798"} Mar 19 20:36:19 crc kubenswrapper[4799]: I0319 20:36:19.913230 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhppw" 
event={"ID":"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65","Type":"ContainerStarted","Data":"4b37d151caae96c653b3d9062e325dd4259bd07e8c724c4199d7ae71e879e19f"} Mar 19 20:36:21 crc kubenswrapper[4799]: I0319 20:36:21.940164 4799 generic.go:334] "Generic (PLEG): container finished" podID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerID="9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27" exitCode=0 Mar 19 20:36:21 crc kubenswrapper[4799]: I0319 20:36:21.940247 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhppw" event={"ID":"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65","Type":"ContainerDied","Data":"9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27"} Mar 19 20:36:22 crc kubenswrapper[4799]: I0319 20:36:22.952679 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhppw" event={"ID":"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65","Type":"ContainerStarted","Data":"a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f"} Mar 19 20:36:22 crc kubenswrapper[4799]: I0319 20:36:22.986751 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhppw" podStartSLOduration=2.549604351 podStartE2EDuration="4.986725984s" podCreationTimestamp="2026-03-19 20:36:18 +0000 UTC" firstStartedPulling="2026-03-19 20:36:19.92266906 +0000 UTC m=+1857.528622162" lastFinishedPulling="2026-03-19 20:36:22.359790713 +0000 UTC m=+1859.965743795" observedRunningTime="2026-03-19 20:36:22.971116343 +0000 UTC m=+1860.577069445" watchObservedRunningTime="2026-03-19 20:36:22.986725984 +0000 UTC m=+1860.592679056" Mar 19 20:36:28 crc kubenswrapper[4799]: I0319 20:36:28.117143 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:36:28 crc kubenswrapper[4799]: E0319 20:36:28.118539 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:36:28 crc kubenswrapper[4799]: I0319 20:36:28.709031 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:28 crc kubenswrapper[4799]: I0319 20:36:28.709076 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:28 crc kubenswrapper[4799]: I0319 20:36:28.765089 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:29 crc kubenswrapper[4799]: I0319 20:36:29.086024 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:30 crc kubenswrapper[4799]: I0319 20:36:30.050666 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-c8765"] Mar 19 20:36:30 crc kubenswrapper[4799]: I0319 20:36:30.062740 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-c8765"] Mar 19 20:36:30 crc kubenswrapper[4799]: I0319 20:36:30.118850 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhppw"] Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.038449 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhppw" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="registry-server" containerID="cri-o://a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f" gracePeriod=2 Mar 
19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.146516 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8149a1d2-5a92-4549-921c-c6a14131c0c7" path="/var/lib/kubelet/pods/8149a1d2-5a92-4549-921c-c6a14131c0c7/volumes" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.572642 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.610711 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct556\" (UniqueName: \"kubernetes.io/projected/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-kube-api-access-ct556\") pod \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.610843 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-utilities\") pod \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.610988 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-catalog-content\") pod \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\" (UID: \"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65\") " Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.611883 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-utilities" (OuterVolumeSpecName: "utilities") pod "c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" (UID: "c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.619577 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-kube-api-access-ct556" (OuterVolumeSpecName: "kube-api-access-ct556") pod "c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" (UID: "c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65"). InnerVolumeSpecName "kube-api-access-ct556". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.714044 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct556\" (UniqueName: \"kubernetes.io/projected/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-kube-api-access-ct556\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.714079 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.861406 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" (UID: "c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:36:31 crc kubenswrapper[4799]: I0319 20:36:31.918195 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.057186 4799 generic.go:334] "Generic (PLEG): container finished" podID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerID="a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f" exitCode=0 Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.057257 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhppw" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.057261 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhppw" event={"ID":"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65","Type":"ContainerDied","Data":"a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f"} Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.057326 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhppw" event={"ID":"c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65","Type":"ContainerDied","Data":"4b37d151caae96c653b3d9062e325dd4259bd07e8c724c4199d7ae71e879e19f"} Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.057378 4799 scope.go:117] "RemoveContainer" containerID="a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.091983 4799 scope.go:117] "RemoveContainer" containerID="9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.093823 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhppw"] Mar 19 20:36:32 crc kubenswrapper[4799]: 
I0319 20:36:32.107697 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhppw"] Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.116273 4799 scope.go:117] "RemoveContainer" containerID="cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.159085 4799 scope.go:117] "RemoveContainer" containerID="a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f" Mar 19 20:36:32 crc kubenswrapper[4799]: E0319 20:36:32.160799 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f\": container with ID starting with a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f not found: ID does not exist" containerID="a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.160860 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f"} err="failed to get container status \"a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f\": rpc error: code = NotFound desc = could not find container \"a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f\": container with ID starting with a9eb4ddba280226472ab3031290b4b74d6b08f3e99d9c42e64a078ceca93480f not found: ID does not exist" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.160895 4799 scope.go:117] "RemoveContainer" containerID="9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27" Mar 19 20:36:32 crc kubenswrapper[4799]: E0319 20:36:32.161512 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27\": container 
with ID starting with 9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27 not found: ID does not exist" containerID="9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.161726 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27"} err="failed to get container status \"9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27\": rpc error: code = NotFound desc = could not find container \"9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27\": container with ID starting with 9282834b870545295e3ceaa7a6713baba1ccd525cba580004966b537615bfc27 not found: ID does not exist" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.161926 4799 scope.go:117] "RemoveContainer" containerID="cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798" Mar 19 20:36:32 crc kubenswrapper[4799]: E0319 20:36:32.162709 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798\": container with ID starting with cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798 not found: ID does not exist" containerID="cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798" Mar 19 20:36:32 crc kubenswrapper[4799]: I0319 20:36:32.162766 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798"} err="failed to get container status \"cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798\": rpc error: code = NotFound desc = could not find container \"cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798\": container with ID starting with cb704debb1e55f8cf46b4a54cbff744ac9b2cdb1bd7e287b348ab53e12cdb798 not 
found: ID does not exist" Mar 19 20:36:33 crc kubenswrapper[4799]: I0319 20:36:33.134141 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" path="/var/lib/kubelet/pods/c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65/volumes" Mar 19 20:36:42 crc kubenswrapper[4799]: I0319 20:36:42.116077 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:36:42 crc kubenswrapper[4799]: E0319 20:36:42.116925 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:36:54 crc kubenswrapper[4799]: I0319 20:36:54.329222 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e20098d-7fb1-4aba-b460-1440751e6dc2" containerID="50ba4fc03be2801b40ab60a56296ae801f2ee8e8583c60679dfb80791173ae6a" exitCode=0 Mar 19 20:36:54 crc kubenswrapper[4799]: I0319 20:36:54.329372 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" event={"ID":"9e20098d-7fb1-4aba-b460-1440751e6dc2","Type":"ContainerDied","Data":"50ba4fc03be2801b40ab60a56296ae801f2ee8e8583c60679dfb80791173ae6a"} Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.119225 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:36:55 crc kubenswrapper[4799]: E0319 20:36:55.119874 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.804019 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.930542 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-repo-setup-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.930663 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.930748 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-inventory\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.930835 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-nova-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 
20:36:55.930911 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.930989 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931069 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ovn-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931133 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ssh-key-openstack-edpm-ipam\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931183 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-bootstrap-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931325 4799 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931472 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-telemetry-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931620 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-neutron-metadata-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931725 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh9m9\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-kube-api-access-jh9m9\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.931773 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-libvirt-combined-ca-bundle\") pod \"9e20098d-7fb1-4aba-b460-1440751e6dc2\" (UID: \"9e20098d-7fb1-4aba-b460-1440751e6dc2\") " Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.938413 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.940338 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.941589 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.941871 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.943211 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.944989 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.947213 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-kube-api-access-jh9m9" (OuterVolumeSpecName: "kube-api-access-jh9m9") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "kube-api-access-jh9m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.947826 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.948866 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.949650 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.950022 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.950973 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.966251 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:55 crc kubenswrapper[4799]: I0319 20:36:55.989703 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-inventory" (OuterVolumeSpecName: "inventory") pod "9e20098d-7fb1-4aba-b460-1440751e6dc2" (UID: "9e20098d-7fb1-4aba-b460-1440751e6dc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035039 4799 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035090 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh9m9\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-kube-api-access-jh9m9\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035110 4799 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035129 4799 reconciler_common.go:293] "Volume detached for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035150 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035172 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035189 4799 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035207 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035230 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035248 4799 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc 
kubenswrapper[4799]: I0319 20:36:56.035266 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035283 4799 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035329 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9e20098d-7fb1-4aba-b460-1440751e6dc2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.035347 4799 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e20098d-7fb1-4aba-b460-1440751e6dc2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.354822 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" event={"ID":"9e20098d-7fb1-4aba-b460-1440751e6dc2","Type":"ContainerDied","Data":"8de6941c6c134611d89b3ca0220b87ab18667511a1782ad76e9273921b2e0aaa"} Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.355301 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8de6941c6c134611d89b3ca0220b87ab18667511a1782ad76e9273921b2e0aaa" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.354904 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nc852" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.536342 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt"] Mar 19 20:36:56 crc kubenswrapper[4799]: E0319 20:36:56.536809 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="extract-utilities" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.536832 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="extract-utilities" Mar 19 20:36:56 crc kubenswrapper[4799]: E0319 20:36:56.536845 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e20098d-7fb1-4aba-b460-1440751e6dc2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.536852 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e20098d-7fb1-4aba-b460-1440751e6dc2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:56 crc kubenswrapper[4799]: E0319 20:36:56.536867 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="registry-server" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.536876 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="registry-server" Mar 19 20:36:56 crc kubenswrapper[4799]: E0319 20:36:56.536888 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="extract-content" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.536894 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="extract-content" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.537066 
4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e5912f-bdd8-4f7f-b0b3-e1ac5ac6fb65" containerName="registry-server" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.537086 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e20098d-7fb1-4aba-b460-1440751e6dc2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.538087 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.541046 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.541291 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.542676 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.543368 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.544226 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.556845 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt"] Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.650424 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b526ddd9-685d-42a9-8598-d5dd7710942a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: 
\"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.650485 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8zt\" (UniqueName: \"kubernetes.io/projected/b526ddd9-685d-42a9-8598-d5dd7710942a-kube-api-access-xc8zt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.650546 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.650571 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.650632 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.752627 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.752867 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b526ddd9-685d-42a9-8598-d5dd7710942a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.752921 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc8zt\" (UniqueName: \"kubernetes.io/projected/b526ddd9-685d-42a9-8598-d5dd7710942a-kube-api-access-xc8zt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.753019 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.753069 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: 
\"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.754989 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b526ddd9-685d-42a9-8598-d5dd7710942a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.758669 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.760351 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.760422 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.786480 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc8zt\" (UniqueName: 
\"kubernetes.io/projected/b526ddd9-685d-42a9-8598-d5dd7710942a-kube-api-access-xc8zt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9nbt\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:56 crc kubenswrapper[4799]: I0319 20:36:56.866352 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:36:57 crc kubenswrapper[4799]: I0319 20:36:57.522523 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt"] Mar 19 20:36:58 crc kubenswrapper[4799]: I0319 20:36:58.372865 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" event={"ID":"b526ddd9-685d-42a9-8598-d5dd7710942a","Type":"ContainerStarted","Data":"4670499587faa16b094ea58400d16b35ce3dd699403ca928c5a237ab4f7c196d"} Mar 19 20:36:58 crc kubenswrapper[4799]: I0319 20:36:58.373293 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" event={"ID":"b526ddd9-685d-42a9-8598-d5dd7710942a","Type":"ContainerStarted","Data":"3c584a1db901880ef48674fa6c11b4c0e9dde1fa9a825b86c47c230857f247a1"} Mar 19 20:36:58 crc kubenswrapper[4799]: I0319 20:36:58.399840 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" podStartSLOduration=1.96244816 podStartE2EDuration="2.399818108s" podCreationTimestamp="2026-03-19 20:36:56 +0000 UTC" firstStartedPulling="2026-03-19 20:36:57.531640402 +0000 UTC m=+1895.137593474" lastFinishedPulling="2026-03-19 20:36:57.96901034 +0000 UTC m=+1895.574963422" observedRunningTime="2026-03-19 20:36:58.393075982 +0000 UTC m=+1895.999029084" watchObservedRunningTime="2026-03-19 20:36:58.399818108 +0000 UTC m=+1896.005771190" Mar 19 20:36:59 crc kubenswrapper[4799]: I0319 
20:36:59.198114 4799 scope.go:117] "RemoveContainer" containerID="8ce6a70a5e04e3d4f7c136b156a0d9a979f5cb3eef6074f7c3511c7bdb2ac4c6" Mar 19 20:36:59 crc kubenswrapper[4799]: I0319 20:36:59.268650 4799 scope.go:117] "RemoveContainer" containerID="d92a3732a5ec0c6a9848dc90b0df404e0a9525a581adf9b7b9de5b97f747ecca" Mar 19 20:37:07 crc kubenswrapper[4799]: I0319 20:37:07.117097 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:37:07 crc kubenswrapper[4799]: E0319 20:37:07.118167 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:37:18 crc kubenswrapper[4799]: I0319 20:37:18.116582 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:37:18 crc kubenswrapper[4799]: E0319 20:37:18.117597 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:37:31 crc kubenswrapper[4799]: I0319 20:37:31.116775 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:37:31 crc kubenswrapper[4799]: E0319 20:37:31.117638 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:37:43 crc kubenswrapper[4799]: I0319 20:37:43.126409 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:37:43 crc kubenswrapper[4799]: E0319 20:37:43.129683 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:37:58 crc kubenswrapper[4799]: I0319 20:37:58.117105 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:37:58 crc kubenswrapper[4799]: E0319 20:37:58.118010 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.159935 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565878-lb6vk"] Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.162962 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.165895 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.167285 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.167711 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.176646 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565878-lb6vk"] Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.329134 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l87pz\" (UniqueName: \"kubernetes.io/projected/99f7aa0e-6dd5-490e-ab10-29499246aeb6-kube-api-access-l87pz\") pod \"auto-csr-approver-29565878-lb6vk\" (UID: \"99f7aa0e-6dd5-490e-ab10-29499246aeb6\") " pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.431366 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l87pz\" (UniqueName: \"kubernetes.io/projected/99f7aa0e-6dd5-490e-ab10-29499246aeb6-kube-api-access-l87pz\") pod \"auto-csr-approver-29565878-lb6vk\" (UID: \"99f7aa0e-6dd5-490e-ab10-29499246aeb6\") " pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.456493 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l87pz\" (UniqueName: \"kubernetes.io/projected/99f7aa0e-6dd5-490e-ab10-29499246aeb6-kube-api-access-l87pz\") pod \"auto-csr-approver-29565878-lb6vk\" (UID: \"99f7aa0e-6dd5-490e-ab10-29499246aeb6\") " 
pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:00 crc kubenswrapper[4799]: I0319 20:38:00.487885 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:01 crc kubenswrapper[4799]: I0319 20:38:01.009596 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565878-lb6vk"] Mar 19 20:38:01 crc kubenswrapper[4799]: W0319 20:38:01.014433 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f7aa0e_6dd5_490e_ab10_29499246aeb6.slice/crio-521b0b6d6e6e953f4a4847932b428410b1c17673104efc7fdfbe06b0a6936282 WatchSource:0}: Error finding container 521b0b6d6e6e953f4a4847932b428410b1c17673104efc7fdfbe06b0a6936282: Status 404 returned error can't find the container with id 521b0b6d6e6e953f4a4847932b428410b1c17673104efc7fdfbe06b0a6936282 Mar 19 20:38:01 crc kubenswrapper[4799]: I0319 20:38:01.076202 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" event={"ID":"99f7aa0e-6dd5-490e-ab10-29499246aeb6","Type":"ContainerStarted","Data":"521b0b6d6e6e953f4a4847932b428410b1c17673104efc7fdfbe06b0a6936282"} Mar 19 20:38:03 crc kubenswrapper[4799]: I0319 20:38:03.096270 4799 generic.go:334] "Generic (PLEG): container finished" podID="99f7aa0e-6dd5-490e-ab10-29499246aeb6" containerID="87893d72c37db67468a4e9258e2ac164f1a0bdbe300405120bd9fea49bec9449" exitCode=0 Mar 19 20:38:03 crc kubenswrapper[4799]: I0319 20:38:03.096357 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" event={"ID":"99f7aa0e-6dd5-490e-ab10-29499246aeb6","Type":"ContainerDied","Data":"87893d72c37db67468a4e9258e2ac164f1a0bdbe300405120bd9fea49bec9449"} Mar 19 20:38:04 crc kubenswrapper[4799]: I0319 20:38:04.462239 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:04 crc kubenswrapper[4799]: I0319 20:38:04.612661 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87pz\" (UniqueName: \"kubernetes.io/projected/99f7aa0e-6dd5-490e-ab10-29499246aeb6-kube-api-access-l87pz\") pod \"99f7aa0e-6dd5-490e-ab10-29499246aeb6\" (UID: \"99f7aa0e-6dd5-490e-ab10-29499246aeb6\") " Mar 19 20:38:04 crc kubenswrapper[4799]: I0319 20:38:04.621352 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f7aa0e-6dd5-490e-ab10-29499246aeb6-kube-api-access-l87pz" (OuterVolumeSpecName: "kube-api-access-l87pz") pod "99f7aa0e-6dd5-490e-ab10-29499246aeb6" (UID: "99f7aa0e-6dd5-490e-ab10-29499246aeb6"). InnerVolumeSpecName "kube-api-access-l87pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:38:04 crc kubenswrapper[4799]: I0319 20:38:04.715548 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l87pz\" (UniqueName: \"kubernetes.io/projected/99f7aa0e-6dd5-490e-ab10-29499246aeb6-kube-api-access-l87pz\") on node \"crc\" DevicePath \"\"" Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.115143 4799 generic.go:334] "Generic (PLEG): container finished" podID="b526ddd9-685d-42a9-8598-d5dd7710942a" containerID="4670499587faa16b094ea58400d16b35ce3dd699403ca928c5a237ab4f7c196d" exitCode=0 Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.116703 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.129965 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" event={"ID":"b526ddd9-685d-42a9-8598-d5dd7710942a","Type":"ContainerDied","Data":"4670499587faa16b094ea58400d16b35ce3dd699403ca928c5a237ab4f7c196d"} Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.130010 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565878-lb6vk" event={"ID":"99f7aa0e-6dd5-490e-ab10-29499246aeb6","Type":"ContainerDied","Data":"521b0b6d6e6e953f4a4847932b428410b1c17673104efc7fdfbe06b0a6936282"} Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.130025 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="521b0b6d6e6e953f4a4847932b428410b1c17673104efc7fdfbe06b0a6936282" Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.551237 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565872-b9qbx"] Mar 19 20:38:05 crc kubenswrapper[4799]: I0319 20:38:05.560467 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565872-b9qbx"] Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.520348 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.662639 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc8zt\" (UniqueName: \"kubernetes.io/projected/b526ddd9-685d-42a9-8598-d5dd7710942a-kube-api-access-xc8zt\") pod \"b526ddd9-685d-42a9-8598-d5dd7710942a\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.662855 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ovn-combined-ca-bundle\") pod \"b526ddd9-685d-42a9-8598-d5dd7710942a\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.663953 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-inventory\") pod \"b526ddd9-685d-42a9-8598-d5dd7710942a\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.664063 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ssh-key-openstack-edpm-ipam\") pod \"b526ddd9-685d-42a9-8598-d5dd7710942a\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.664249 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b526ddd9-685d-42a9-8598-d5dd7710942a-ovncontroller-config-0\") pod \"b526ddd9-685d-42a9-8598-d5dd7710942a\" (UID: \"b526ddd9-685d-42a9-8598-d5dd7710942a\") " Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.670723 4799 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b526ddd9-685d-42a9-8598-d5dd7710942a-kube-api-access-xc8zt" (OuterVolumeSpecName: "kube-api-access-xc8zt") pod "b526ddd9-685d-42a9-8598-d5dd7710942a" (UID: "b526ddd9-685d-42a9-8598-d5dd7710942a"). InnerVolumeSpecName "kube-api-access-xc8zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.678051 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b526ddd9-685d-42a9-8598-d5dd7710942a" (UID: "b526ddd9-685d-42a9-8598-d5dd7710942a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.706834 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b526ddd9-685d-42a9-8598-d5dd7710942a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b526ddd9-685d-42a9-8598-d5dd7710942a" (UID: "b526ddd9-685d-42a9-8598-d5dd7710942a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.713963 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-inventory" (OuterVolumeSpecName: "inventory") pod "b526ddd9-685d-42a9-8598-d5dd7710942a" (UID: "b526ddd9-685d-42a9-8598-d5dd7710942a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.725185 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b526ddd9-685d-42a9-8598-d5dd7710942a" (UID: "b526ddd9-685d-42a9-8598-d5dd7710942a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.766981 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.767010 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.767022 4799 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b526ddd9-685d-42a9-8598-d5dd7710942a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.767030 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc8zt\" (UniqueName: \"kubernetes.io/projected/b526ddd9-685d-42a9-8598-d5dd7710942a-kube-api-access-xc8zt\") on node \"crc\" DevicePath \"\"" Mar 19 20:38:06 crc kubenswrapper[4799]: I0319 20:38:06.767038 4799 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b526ddd9-685d-42a9-8598-d5dd7710942a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.136671 4799 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.137330 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293" path="/var/lib/kubelet/pods/30dad9f4-9a6b-4adb-9b6a-5dc3d8eae293/volumes" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.139141 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9nbt" event={"ID":"b526ddd9-685d-42a9-8598-d5dd7710942a","Type":"ContainerDied","Data":"3c584a1db901880ef48674fa6c11b4c0e9dde1fa9a825b86c47c230857f247a1"} Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.139189 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c584a1db901880ef48674fa6c11b4c0e9dde1fa9a825b86c47c230857f247a1" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.332363 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6"] Mar 19 20:38:07 crc kubenswrapper[4799]: E0319 20:38:07.332995 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f7aa0e-6dd5-490e-ab10-29499246aeb6" containerName="oc" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.333024 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f7aa0e-6dd5-490e-ab10-29499246aeb6" containerName="oc" Mar 19 20:38:07 crc kubenswrapper[4799]: E0319 20:38:07.333068 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b526ddd9-685d-42a9-8598-d5dd7710942a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.333081 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b526ddd9-685d-42a9-8598-d5dd7710942a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 
20:38:07.333373 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f7aa0e-6dd5-490e-ab10-29499246aeb6" containerName="oc" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.333491 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b526ddd9-685d-42a9-8598-d5dd7710942a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.334523 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.337951 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6"] Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.340066 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.340363 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.340696 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.340867 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.341047 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.341213 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.481090 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.481140 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.481337 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24hxb\" (UniqueName: \"kubernetes.io/projected/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-kube-api-access-24hxb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.481425 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.481769 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.481847 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.583860 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24hxb\" (UniqueName: \"kubernetes.io/projected/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-kube-api-access-24hxb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.583925 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.583949 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.583968 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.584012 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.584036 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.590045 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.590189 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.590208 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.590806 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.592416 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.607747 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24hxb\" (UniqueName: \"kubernetes.io/projected/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-kube-api-access-24hxb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:07 crc kubenswrapper[4799]: I0319 20:38:07.689001 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:38:08 crc kubenswrapper[4799]: I0319 20:38:08.333721 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6"] Mar 19 20:38:08 crc kubenswrapper[4799]: W0319 20:38:08.339340 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b384cd_034d_407f_a7f6_0b1b0ddffb4f.slice/crio-ab24f4167c91aebcc7a1a6906986b601ccc65aa862c2f838678a64bd7d307548 WatchSource:0}: Error finding container ab24f4167c91aebcc7a1a6906986b601ccc65aa862c2f838678a64bd7d307548: Status 404 returned error can't find the container with id ab24f4167c91aebcc7a1a6906986b601ccc65aa862c2f838678a64bd7d307548 Mar 19 20:38:09 crc kubenswrapper[4799]: I0319 20:38:09.160695 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" event={"ID":"28b384cd-034d-407f-a7f6-0b1b0ddffb4f","Type":"ContainerStarted","Data":"ab24f4167c91aebcc7a1a6906986b601ccc65aa862c2f838678a64bd7d307548"} Mar 19 20:38:10 crc kubenswrapper[4799]: I0319 20:38:10.175495 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" event={"ID":"28b384cd-034d-407f-a7f6-0b1b0ddffb4f","Type":"ContainerStarted","Data":"9934870a10d5ecc2db5eca1178ad9ca98aa0029c5fc89dd6b53366e1ba4a2369"} Mar 19 20:38:11 crc kubenswrapper[4799]: I0319 20:38:11.119914 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:38:11 crc kubenswrapper[4799]: E0319 20:38:11.120203 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:38:23 crc kubenswrapper[4799]: I0319 20:38:23.141929 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:38:23 crc kubenswrapper[4799]: E0319 20:38:23.143973 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:38:36 crc kubenswrapper[4799]: I0319 20:38:36.117688 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:38:36 crc kubenswrapper[4799]: E0319 20:38:36.118992 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:38:47 crc kubenswrapper[4799]: I0319 20:38:47.116226 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:38:47 crc kubenswrapper[4799]: E0319 20:38:47.117230 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:38:58 crc kubenswrapper[4799]: I0319 20:38:58.707511 4799 generic.go:334] "Generic (PLEG): container finished" podID="28b384cd-034d-407f-a7f6-0b1b0ddffb4f" containerID="9934870a10d5ecc2db5eca1178ad9ca98aa0029c5fc89dd6b53366e1ba4a2369" exitCode=0 Mar 19 20:38:58 crc kubenswrapper[4799]: I0319 20:38:58.707607 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" event={"ID":"28b384cd-034d-407f-a7f6-0b1b0ddffb4f","Type":"ContainerDied","Data":"9934870a10d5ecc2db5eca1178ad9ca98aa0029c5fc89dd6b53366e1ba4a2369"} Mar 19 20:38:59 crc kubenswrapper[4799]: I0319 20:38:59.416089 4799 scope.go:117] "RemoveContainer" containerID="b13d06143d7a46250e7c9f42bd9ff09aecd14950703ed9c89945d81034a818d8" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.168220 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.186229 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-ssh-key-openstack-edpm-ipam\") pod \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.186641 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-nova-metadata-neutron-config-0\") pod \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.186677 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-metadata-combined-ca-bundle\") pod \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.186781 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-inventory\") pod \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.186822 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\" (UID: 
\"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.186862 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24hxb\" (UniqueName: \"kubernetes.io/projected/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-kube-api-access-24hxb\") pod \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\" (UID: \"28b384cd-034d-407f-a7f6-0b1b0ddffb4f\") " Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.218279 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "28b384cd-034d-407f-a7f6-0b1b0ddffb4f" (UID: "28b384cd-034d-407f-a7f6-0b1b0ddffb4f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.218415 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-kube-api-access-24hxb" (OuterVolumeSpecName: "kube-api-access-24hxb") pod "28b384cd-034d-407f-a7f6-0b1b0ddffb4f" (UID: "28b384cd-034d-407f-a7f6-0b1b0ddffb4f"). InnerVolumeSpecName "kube-api-access-24hxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.231005 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "28b384cd-034d-407f-a7f6-0b1b0ddffb4f" (UID: "28b384cd-034d-407f-a7f6-0b1b0ddffb4f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.241196 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-inventory" (OuterVolumeSpecName: "inventory") pod "28b384cd-034d-407f-a7f6-0b1b0ddffb4f" (UID: "28b384cd-034d-407f-a7f6-0b1b0ddffb4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.245928 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "28b384cd-034d-407f-a7f6-0b1b0ddffb4f" (UID: "28b384cd-034d-407f-a7f6-0b1b0ddffb4f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.262256 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28b384cd-034d-407f-a7f6-0b1b0ddffb4f" (UID: "28b384cd-034d-407f-a7f6-0b1b0ddffb4f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.288684 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.288727 4799 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.288743 4799 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.288757 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.288771 4799 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.288783 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24hxb\" (UniqueName: \"kubernetes.io/projected/28b384cd-034d-407f-a7f6-0b1b0ddffb4f-kube-api-access-24hxb\") on node \"crc\" DevicePath \"\"" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.730633 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" event={"ID":"28b384cd-034d-407f-a7f6-0b1b0ddffb4f","Type":"ContainerDied","Data":"ab24f4167c91aebcc7a1a6906986b601ccc65aa862c2f838678a64bd7d307548"} Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.730688 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab24f4167c91aebcc7a1a6906986b601ccc65aa862c2f838678a64bd7d307548" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.730716 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.855078 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6"] Mar 19 20:39:00 crc kubenswrapper[4799]: E0319 20:39:00.855831 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b384cd-034d-407f-a7f6-0b1b0ddffb4f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.855867 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b384cd-034d-407f-a7f6-0b1b0ddffb4f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.856252 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b384cd-034d-407f-a7f6-0b1b0ddffb4f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.857343 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.861154 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.861630 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.863271 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.863648 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.866500 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6"] Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.870346 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.902573 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsbd\" (UniqueName: \"kubernetes.io/projected/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-kube-api-access-rtsbd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.902616 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: 
\"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.902652 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.902683 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:00 crc kubenswrapper[4799]: I0319 20:39:00.903001 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.004527 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 
20:39:01.004684 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsbd\" (UniqueName: \"kubernetes.io/projected/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-kube-api-access-rtsbd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.004727 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.004755 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.004791 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.010509 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.011704 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.012051 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.012667 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.032753 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsbd\" (UniqueName: \"kubernetes.io/projected/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-kube-api-access-rtsbd\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.194793 4799 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.618345 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6"] Mar 19 20:39:01 crc kubenswrapper[4799]: I0319 20:39:01.741000 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" event={"ID":"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0","Type":"ContainerStarted","Data":"bcf65e86a723dd445c94d624e97b4968c7b4f3cc02a079266c7746556fd564ba"} Mar 19 20:39:02 crc kubenswrapper[4799]: I0319 20:39:02.116311 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:39:02 crc kubenswrapper[4799]: E0319 20:39:02.116979 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:39:03 crc kubenswrapper[4799]: I0319 20:39:03.755424 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" event={"ID":"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0","Type":"ContainerStarted","Data":"c612ed1d54ace51e0f2e07a6ca5560a78ef821bc0ece12953ee71a784dddc061"} Mar 19 20:39:03 crc kubenswrapper[4799]: I0319 20:39:03.780496 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" podStartSLOduration=2.850821215 podStartE2EDuration="3.780469949s" podCreationTimestamp="2026-03-19 20:39:00 +0000 UTC" firstStartedPulling="2026-03-19 20:39:01.622642063 
+0000 UTC m=+2019.228595135" lastFinishedPulling="2026-03-19 20:39:02.552290787 +0000 UTC m=+2020.158243869" observedRunningTime="2026-03-19 20:39:03.770923472 +0000 UTC m=+2021.376876604" watchObservedRunningTime="2026-03-19 20:39:03.780469949 +0000 UTC m=+2021.386423061" Mar 19 20:39:17 crc kubenswrapper[4799]: I0319 20:39:17.116926 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:39:17 crc kubenswrapper[4799]: E0319 20:39:17.118377 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:39:32 crc kubenswrapper[4799]: I0319 20:39:32.116802 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:39:32 crc kubenswrapper[4799]: E0319 20:39:32.117778 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:39:47 crc kubenswrapper[4799]: I0319 20:39:47.116096 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:39:47 crc kubenswrapper[4799]: E0319 20:39:47.117419 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.116444 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.174909 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565880-mj9lx"] Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.177261 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.180956 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.181013 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.181240 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.189315 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565880-mj9lx"] Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.291894 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57j6x\" (UniqueName: \"kubernetes.io/projected/18033e1d-e2e2-45d9-af5c-cf74cbb08471-kube-api-access-57j6x\") pod \"auto-csr-approver-29565880-mj9lx\" (UID: \"18033e1d-e2e2-45d9-af5c-cf74cbb08471\") " pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 
20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.394498 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57j6x\" (UniqueName: \"kubernetes.io/projected/18033e1d-e2e2-45d9-af5c-cf74cbb08471-kube-api-access-57j6x\") pod \"auto-csr-approver-29565880-mj9lx\" (UID: \"18033e1d-e2e2-45d9-af5c-cf74cbb08471\") " pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.403945 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"2740eb9aae6bdd4350798a29265d5bfc5c2dd48e02f1b9ff9360379372fe8e04"} Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.416481 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57j6x\" (UniqueName: \"kubernetes.io/projected/18033e1d-e2e2-45d9-af5c-cf74cbb08471-kube-api-access-57j6x\") pod \"auto-csr-approver-29565880-mj9lx\" (UID: \"18033e1d-e2e2-45d9-af5c-cf74cbb08471\") " pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 20:40:00 crc kubenswrapper[4799]: I0319 20:40:00.573864 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 20:40:01 crc kubenswrapper[4799]: I0319 20:40:01.091918 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565880-mj9lx"] Mar 19 20:40:01 crc kubenswrapper[4799]: I0319 20:40:01.414787 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" event={"ID":"18033e1d-e2e2-45d9-af5c-cf74cbb08471","Type":"ContainerStarted","Data":"429b5f8beb36ce3e3a67c697ad48f52f8caf9bfbccf005294b7049ff28aaf4e6"} Mar 19 20:40:03 crc kubenswrapper[4799]: I0319 20:40:03.435163 4799 generic.go:334] "Generic (PLEG): container finished" podID="18033e1d-e2e2-45d9-af5c-cf74cbb08471" containerID="ce7f276a5c90b63a961c7df42c5116a4e6ea781d30ace429263f8c3937097c68" exitCode=0 Mar 19 20:40:03 crc kubenswrapper[4799]: I0319 20:40:03.435263 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" event={"ID":"18033e1d-e2e2-45d9-af5c-cf74cbb08471","Type":"ContainerDied","Data":"ce7f276a5c90b63a961c7df42c5116a4e6ea781d30ace429263f8c3937097c68"} Mar 19 20:40:04 crc kubenswrapper[4799]: I0319 20:40:04.838018 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 20:40:04 crc kubenswrapper[4799]: I0319 20:40:04.996081 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57j6x\" (UniqueName: \"kubernetes.io/projected/18033e1d-e2e2-45d9-af5c-cf74cbb08471-kube-api-access-57j6x\") pod \"18033e1d-e2e2-45d9-af5c-cf74cbb08471\" (UID: \"18033e1d-e2e2-45d9-af5c-cf74cbb08471\") " Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.008713 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18033e1d-e2e2-45d9-af5c-cf74cbb08471-kube-api-access-57j6x" (OuterVolumeSpecName: "kube-api-access-57j6x") pod "18033e1d-e2e2-45d9-af5c-cf74cbb08471" (UID: "18033e1d-e2e2-45d9-af5c-cf74cbb08471"). InnerVolumeSpecName "kube-api-access-57j6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.098686 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57j6x\" (UniqueName: \"kubernetes.io/projected/18033e1d-e2e2-45d9-af5c-cf74cbb08471-kube-api-access-57j6x\") on node \"crc\" DevicePath \"\"" Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.476860 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" event={"ID":"18033e1d-e2e2-45d9-af5c-cf74cbb08471","Type":"ContainerDied","Data":"429b5f8beb36ce3e3a67c697ad48f52f8caf9bfbccf005294b7049ff28aaf4e6"} Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.477466 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="429b5f8beb36ce3e3a67c697ad48f52f8caf9bfbccf005294b7049ff28aaf4e6" Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.477497 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565880-mj9lx" Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.930083 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565874-v455m"] Mar 19 20:40:05 crc kubenswrapper[4799]: I0319 20:40:05.947887 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565874-v455m"] Mar 19 20:40:07 crc kubenswrapper[4799]: I0319 20:40:07.133617 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab17f72-6891-4733-a992-a4c84ff34b3e" path="/var/lib/kubelet/pods/2ab17f72-6891-4733-a992-a4c84ff34b3e/volumes" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.085558 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d6w77"] Mar 19 20:40:23 crc kubenswrapper[4799]: E0319 20:40:23.092881 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18033e1d-e2e2-45d9-af5c-cf74cbb08471" containerName="oc" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.092913 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="18033e1d-e2e2-45d9-af5c-cf74cbb08471" containerName="oc" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.093272 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="18033e1d-e2e2-45d9-af5c-cf74cbb08471" containerName="oc" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.095586 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.105624 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6w77"] Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.274759 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tcvx\" (UniqueName: \"kubernetes.io/projected/9e8a579e-0be8-4589-a655-6b43685e10c3-kube-api-access-6tcvx\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.275095 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-catalog-content\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.275208 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-utilities\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.377112 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tcvx\" (UniqueName: \"kubernetes.io/projected/9e8a579e-0be8-4589-a655-6b43685e10c3-kube-api-access-6tcvx\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.377231 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-catalog-content\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.377259 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-utilities\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.377726 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-utilities\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.378191 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-catalog-content\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.404447 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tcvx\" (UniqueName: \"kubernetes.io/projected/9e8a579e-0be8-4589-a655-6b43685e10c3-kube-api-access-6tcvx\") pod \"redhat-operators-d6w77\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.453105 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:23 crc kubenswrapper[4799]: I0319 20:40:23.944513 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6w77"] Mar 19 20:40:24 crc kubenswrapper[4799]: I0319 20:40:24.747220 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerID="53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1" exitCode=0 Mar 19 20:40:24 crc kubenswrapper[4799]: I0319 20:40:24.747264 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerDied","Data":"53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1"} Mar 19 20:40:24 crc kubenswrapper[4799]: I0319 20:40:24.747293 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerStarted","Data":"58f99981d41e1a1afc6ccdbe52cb307cca5fd8e72744fd32ee13560c0c38b302"} Mar 19 20:40:25 crc kubenswrapper[4799]: I0319 20:40:25.763020 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerStarted","Data":"b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c"} Mar 19 20:40:26 crc kubenswrapper[4799]: I0319 20:40:26.779576 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerID="b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c" exitCode=0 Mar 19 20:40:26 crc kubenswrapper[4799]: I0319 20:40:26.779698 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" 
event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerDied","Data":"b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c"} Mar 19 20:40:27 crc kubenswrapper[4799]: I0319 20:40:27.792772 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerStarted","Data":"677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac"} Mar 19 20:40:27 crc kubenswrapper[4799]: I0319 20:40:27.816015 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d6w77" podStartSLOduration=2.324758702 podStartE2EDuration="4.81599747s" podCreationTimestamp="2026-03-19 20:40:23 +0000 UTC" firstStartedPulling="2026-03-19 20:40:24.749679752 +0000 UTC m=+2102.355632834" lastFinishedPulling="2026-03-19 20:40:27.24091849 +0000 UTC m=+2104.846871602" observedRunningTime="2026-03-19 20:40:27.81207794 +0000 UTC m=+2105.418031042" watchObservedRunningTime="2026-03-19 20:40:27.81599747 +0000 UTC m=+2105.421950542" Mar 19 20:40:33 crc kubenswrapper[4799]: I0319 20:40:33.454429 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:33 crc kubenswrapper[4799]: I0319 20:40:33.455020 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:34 crc kubenswrapper[4799]: I0319 20:40:34.516051 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d6w77" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:40:34 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:40:34 crc kubenswrapper[4799]: > Mar 19 20:40:43 crc kubenswrapper[4799]: I0319 20:40:43.528127 4799 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:43 crc kubenswrapper[4799]: I0319 20:40:43.596932 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:43 crc kubenswrapper[4799]: I0319 20:40:43.775952 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6w77"] Mar 19 20:40:44 crc kubenswrapper[4799]: I0319 20:40:44.972236 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d6w77" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="registry-server" containerID="cri-o://677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac" gracePeriod=2 Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.466503 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.527960 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-catalog-content\") pod \"9e8a579e-0be8-4589-a655-6b43685e10c3\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.528087 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tcvx\" (UniqueName: \"kubernetes.io/projected/9e8a579e-0be8-4589-a655-6b43685e10c3-kube-api-access-6tcvx\") pod \"9e8a579e-0be8-4589-a655-6b43685e10c3\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.528138 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-utilities\") pod 
\"9e8a579e-0be8-4589-a655-6b43685e10c3\" (UID: \"9e8a579e-0be8-4589-a655-6b43685e10c3\") " Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.529031 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-utilities" (OuterVolumeSpecName: "utilities") pod "9e8a579e-0be8-4589-a655-6b43685e10c3" (UID: "9e8a579e-0be8-4589-a655-6b43685e10c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.533489 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8a579e-0be8-4589-a655-6b43685e10c3-kube-api-access-6tcvx" (OuterVolumeSpecName: "kube-api-access-6tcvx") pod "9e8a579e-0be8-4589-a655-6b43685e10c3" (UID: "9e8a579e-0be8-4589-a655-6b43685e10c3"). InnerVolumeSpecName "kube-api-access-6tcvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.630083 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tcvx\" (UniqueName: \"kubernetes.io/projected/9e8a579e-0be8-4589-a655-6b43685e10c3-kube-api-access-6tcvx\") on node \"crc\" DevicePath \"\"" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.630359 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.667559 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e8a579e-0be8-4589-a655-6b43685e10c3" (UID: "9e8a579e-0be8-4589-a655-6b43685e10c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.732928 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e8a579e-0be8-4589-a655-6b43685e10c3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.983514 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerID="677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac" exitCode=0 Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.983566 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6w77" Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.983573 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerDied","Data":"677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac"} Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.983646 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6w77" event={"ID":"9e8a579e-0be8-4589-a655-6b43685e10c3","Type":"ContainerDied","Data":"58f99981d41e1a1afc6ccdbe52cb307cca5fd8e72744fd32ee13560c0c38b302"} Mar 19 20:40:45 crc kubenswrapper[4799]: I0319 20:40:45.983676 4799 scope.go:117] "RemoveContainer" containerID="677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.008817 4799 scope.go:117] "RemoveContainer" containerID="b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.043365 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6w77"] Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 
20:40:46.044101 4799 scope.go:117] "RemoveContainer" containerID="53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.061489 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d6w77"] Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.102658 4799 scope.go:117] "RemoveContainer" containerID="677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac" Mar 19 20:40:46 crc kubenswrapper[4799]: E0319 20:40:46.104255 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac\": container with ID starting with 677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac not found: ID does not exist" containerID="677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.104316 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac"} err="failed to get container status \"677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac\": rpc error: code = NotFound desc = could not find container \"677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac\": container with ID starting with 677c36426aebe5f7da31560519169c7bb008b6ff1876fca336903012255306ac not found: ID does not exist" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.104353 4799 scope.go:117] "RemoveContainer" containerID="b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c" Mar 19 20:40:46 crc kubenswrapper[4799]: E0319 20:40:46.104727 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c\": container with ID 
starting with b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c not found: ID does not exist" containerID="b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.104752 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c"} err="failed to get container status \"b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c\": rpc error: code = NotFound desc = could not find container \"b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c\": container with ID starting with b4fe8a5fef46e2d7b30e6a5b42a37e79bb1064d499da7f9883acf94efa8ca82c not found: ID does not exist" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.104766 4799 scope.go:117] "RemoveContainer" containerID="53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1" Mar 19 20:40:46 crc kubenswrapper[4799]: E0319 20:40:46.105072 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1\": container with ID starting with 53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1 not found: ID does not exist" containerID="53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1" Mar 19 20:40:46 crc kubenswrapper[4799]: I0319 20:40:46.105118 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1"} err="failed to get container status \"53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1\": rpc error: code = NotFound desc = could not find container \"53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1\": container with ID starting with 53ffc9d3f6686b2a896b5edbe48e8f151e1e54b4772d041f3d78f5053f6fe4c1 not found: 
ID does not exist" Mar 19 20:40:47 crc kubenswrapper[4799]: I0319 20:40:47.125965 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" path="/var/lib/kubelet/pods/9e8a579e-0be8-4589-a655-6b43685e10c3/volumes" Mar 19 20:40:59 crc kubenswrapper[4799]: I0319 20:40:59.523233 4799 scope.go:117] "RemoveContainer" containerID="078f9d022d16dd421af99a06229577e775f9009630de900152c89a985383b337" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.029999 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qcpg"] Mar 19 20:41:55 crc kubenswrapper[4799]: E0319 20:41:55.031322 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="extract-utilities" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.031339 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="extract-utilities" Mar 19 20:41:55 crc kubenswrapper[4799]: E0319 20:41:55.031352 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="extract-content" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.031362 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="extract-content" Mar 19 20:41:55 crc kubenswrapper[4799]: E0319 20:41:55.031426 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="registry-server" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.031435 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" containerName="registry-server" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.031655 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e8a579e-0be8-4589-a655-6b43685e10c3" 
containerName="registry-server" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.033193 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.047980 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qcpg"] Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.180053 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-utilities\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.180857 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-catalog-content\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.181132 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dmqp\" (UniqueName: \"kubernetes.io/projected/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-kube-api-access-7dmqp\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.283314 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dmqp\" (UniqueName: \"kubernetes.io/projected/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-kube-api-access-7dmqp\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") 
" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.283854 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-utilities\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.284061 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-catalog-content\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.284331 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-utilities\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.284561 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-catalog-content\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.310131 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dmqp\" (UniqueName: \"kubernetes.io/projected/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-kube-api-access-7dmqp\") pod \"certified-operators-9qcpg\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " 
pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.359058 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.684128 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qcpg"] Mar 19 20:41:55 crc kubenswrapper[4799]: W0319 20:41:55.693657 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6dadba5_8e05_4fb6_81de_aca68a7d5e66.slice/crio-c5e969327d3b2b5ed3f24c6c3961a3ce7f1ca31d1c18a1b14c9c1f62e878f132 WatchSource:0}: Error finding container c5e969327d3b2b5ed3f24c6c3961a3ce7f1ca31d1c18a1b14c9c1f62e878f132: Status 404 returned error can't find the container with id c5e969327d3b2b5ed3f24c6c3961a3ce7f1ca31d1c18a1b14c9c1f62e878f132 Mar 19 20:41:55 crc kubenswrapper[4799]: I0319 20:41:55.736440 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerStarted","Data":"c5e969327d3b2b5ed3f24c6c3961a3ce7f1ca31d1c18a1b14c9c1f62e878f132"} Mar 19 20:41:56 crc kubenswrapper[4799]: I0319 20:41:56.753158 4799 generic.go:334] "Generic (PLEG): container finished" podID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerID="f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d" exitCode=0 Mar 19 20:41:56 crc kubenswrapper[4799]: I0319 20:41:56.753320 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerDied","Data":"f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d"} Mar 19 20:41:56 crc kubenswrapper[4799]: I0319 20:41:56.758654 4799 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 19 20:41:57 crc kubenswrapper[4799]: I0319 20:41:57.769714 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerStarted","Data":"5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd"} Mar 19 20:41:58 crc kubenswrapper[4799]: I0319 20:41:58.782052 4799 generic.go:334] "Generic (PLEG): container finished" podID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerID="5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd" exitCode=0 Mar 19 20:41:58 crc kubenswrapper[4799]: I0319 20:41:58.782096 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerDied","Data":"5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd"} Mar 19 20:41:59 crc kubenswrapper[4799]: I0319 20:41:59.796158 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerStarted","Data":"f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98"} Mar 19 20:41:59 crc kubenswrapper[4799]: I0319 20:41:59.827496 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qcpg" podStartSLOduration=2.369653202 podStartE2EDuration="4.827480534s" podCreationTimestamp="2026-03-19 20:41:55 +0000 UTC" firstStartedPulling="2026-03-19 20:41:56.758073265 +0000 UTC m=+2194.364026387" lastFinishedPulling="2026-03-19 20:41:59.215900637 +0000 UTC m=+2196.821853719" observedRunningTime="2026-03-19 20:41:59.82414262 +0000 UTC m=+2197.430095792" watchObservedRunningTime="2026-03-19 20:41:59.827480534 +0000 UTC m=+2197.433433606" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.154524 4799 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565882-s8rnc"] Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.155961 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.158955 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.163729 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.165214 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565882-s8rnc"] Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.166156 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.318306 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ss88\" (UniqueName: \"kubernetes.io/projected/68221cef-725e-4911-a651-39fc18fc5715-kube-api-access-8ss88\") pod \"auto-csr-approver-29565882-s8rnc\" (UID: \"68221cef-725e-4911-a651-39fc18fc5715\") " pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.420092 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ss88\" (UniqueName: \"kubernetes.io/projected/68221cef-725e-4911-a651-39fc18fc5715-kube-api-access-8ss88\") pod \"auto-csr-approver-29565882-s8rnc\" (UID: \"68221cef-725e-4911-a651-39fc18fc5715\") " pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.441182 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ss88\" 
(UniqueName: \"kubernetes.io/projected/68221cef-725e-4911-a651-39fc18fc5715-kube-api-access-8ss88\") pod \"auto-csr-approver-29565882-s8rnc\" (UID: \"68221cef-725e-4911-a651-39fc18fc5715\") " pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.478673 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:00 crc kubenswrapper[4799]: I0319 20:42:00.962884 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565882-s8rnc"] Mar 19 20:42:00 crc kubenswrapper[4799]: W0319 20:42:00.966910 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68221cef_725e_4911_a651_39fc18fc5715.slice/crio-3316e87b1f2440dc3068b22af03d01b961a00abe1fbaa0a48cce73934739be3b WatchSource:0}: Error finding container 3316e87b1f2440dc3068b22af03d01b961a00abe1fbaa0a48cce73934739be3b: Status 404 returned error can't find the container with id 3316e87b1f2440dc3068b22af03d01b961a00abe1fbaa0a48cce73934739be3b Mar 19 20:42:01 crc kubenswrapper[4799]: I0319 20:42:01.815625 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" event={"ID":"68221cef-725e-4911-a651-39fc18fc5715","Type":"ContainerStarted","Data":"3316e87b1f2440dc3068b22af03d01b961a00abe1fbaa0a48cce73934739be3b"} Mar 19 20:42:02 crc kubenswrapper[4799]: I0319 20:42:02.827989 4799 generic.go:334] "Generic (PLEG): container finished" podID="68221cef-725e-4911-a651-39fc18fc5715" containerID="d56b1d204805ff7b97a5a58b2051903d9c796290f5371df56415d584fcff7094" exitCode=0 Mar 19 20:42:02 crc kubenswrapper[4799]: I0319 20:42:02.828049 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" 
event={"ID":"68221cef-725e-4911-a651-39fc18fc5715","Type":"ContainerDied","Data":"d56b1d204805ff7b97a5a58b2051903d9c796290f5371df56415d584fcff7094"} Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.200313 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.309555 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ss88\" (UniqueName: \"kubernetes.io/projected/68221cef-725e-4911-a651-39fc18fc5715-kube-api-access-8ss88\") pod \"68221cef-725e-4911-a651-39fc18fc5715\" (UID: \"68221cef-725e-4911-a651-39fc18fc5715\") " Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.317214 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68221cef-725e-4911-a651-39fc18fc5715-kube-api-access-8ss88" (OuterVolumeSpecName: "kube-api-access-8ss88") pod "68221cef-725e-4911-a651-39fc18fc5715" (UID: "68221cef-725e-4911-a651-39fc18fc5715"). InnerVolumeSpecName "kube-api-access-8ss88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.412112 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ss88\" (UniqueName: \"kubernetes.io/projected/68221cef-725e-4911-a651-39fc18fc5715-kube-api-access-8ss88\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.849752 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" event={"ID":"68221cef-725e-4911-a651-39fc18fc5715","Type":"ContainerDied","Data":"3316e87b1f2440dc3068b22af03d01b961a00abe1fbaa0a48cce73934739be3b"} Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.849818 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3316e87b1f2440dc3068b22af03d01b961a00abe1fbaa0a48cce73934739be3b" Mar 19 20:42:04 crc kubenswrapper[4799]: I0319 20:42:04.849873 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565882-s8rnc" Mar 19 20:42:05 crc kubenswrapper[4799]: I0319 20:42:05.294867 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565876-kcmfl"] Mar 19 20:42:05 crc kubenswrapper[4799]: I0319 20:42:05.310890 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565876-kcmfl"] Mar 19 20:42:05 crc kubenswrapper[4799]: I0319 20:42:05.360334 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:42:05 crc kubenswrapper[4799]: I0319 20:42:05.360437 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:42:05 crc kubenswrapper[4799]: I0319 20:42:05.423957 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qcpg" Mar 
19 20:42:05 crc kubenswrapper[4799]: I0319 20:42:05.949200 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:42:07 crc kubenswrapper[4799]: I0319 20:42:07.136370 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d48006-ae79-434d-80fe-02d1d3884e33" path="/var/lib/kubelet/pods/32d48006-ae79-434d-80fe-02d1d3884e33/volumes" Mar 19 20:42:07 crc kubenswrapper[4799]: I0319 20:42:07.698025 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qcpg"] Mar 19 20:42:07 crc kubenswrapper[4799]: I0319 20:42:07.887948 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qcpg" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="registry-server" containerID="cri-o://f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98" gracePeriod=2 Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.425961 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.540523 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dmqp\" (UniqueName: \"kubernetes.io/projected/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-kube-api-access-7dmqp\") pod \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.540735 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-catalog-content\") pod \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.540858 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-utilities\") pod \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\" (UID: \"e6dadba5-8e05-4fb6-81de-aca68a7d5e66\") " Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.543585 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-utilities" (OuterVolumeSpecName: "utilities") pod "e6dadba5-8e05-4fb6-81de-aca68a7d5e66" (UID: "e6dadba5-8e05-4fb6-81de-aca68a7d5e66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.559989 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-kube-api-access-7dmqp" (OuterVolumeSpecName: "kube-api-access-7dmqp") pod "e6dadba5-8e05-4fb6-81de-aca68a7d5e66" (UID: "e6dadba5-8e05-4fb6-81de-aca68a7d5e66"). InnerVolumeSpecName "kube-api-access-7dmqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.644798 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.644871 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dmqp\" (UniqueName: \"kubernetes.io/projected/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-kube-api-access-7dmqp\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.818916 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6dadba5-8e05-4fb6-81de-aca68a7d5e66" (UID: "e6dadba5-8e05-4fb6-81de-aca68a7d5e66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.848644 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6dadba5-8e05-4fb6-81de-aca68a7d5e66-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.903510 4799 generic.go:334] "Generic (PLEG): container finished" podID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerID="f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98" exitCode=0 Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.903567 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerDied","Data":"f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98"} Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.903613 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9qcpg" event={"ID":"e6dadba5-8e05-4fb6-81de-aca68a7d5e66","Type":"ContainerDied","Data":"c5e969327d3b2b5ed3f24c6c3961a3ce7f1ca31d1c18a1b14c9c1f62e878f132"} Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.903641 4799 scope.go:117] "RemoveContainer" containerID="f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.903647 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qcpg" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.931449 4799 scope.go:117] "RemoveContainer" containerID="5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd" Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.951645 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qcpg"] Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.959265 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qcpg"] Mar 19 20:42:08 crc kubenswrapper[4799]: I0319 20:42:08.990416 4799 scope.go:117] "RemoveContainer" containerID="f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 20:42:09.018726 4799 scope.go:117] "RemoveContainer" containerID="f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98" Mar 19 20:42:09 crc kubenswrapper[4799]: E0319 20:42:09.019284 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98\": container with ID starting with f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98 not found: ID does not exist" containerID="f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 
20:42:09.019352 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98"} err="failed to get container status \"f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98\": rpc error: code = NotFound desc = could not find container \"f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98\": container with ID starting with f0762c22e7903ab51b59b1e80c59275cff0c24de10e64c932f27aef4e3ed8e98 not found: ID does not exist" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 20:42:09.019417 4799 scope.go:117] "RemoveContainer" containerID="5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd" Mar 19 20:42:09 crc kubenswrapper[4799]: E0319 20:42:09.020141 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd\": container with ID starting with 5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd not found: ID does not exist" containerID="5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 20:42:09.020183 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd"} err="failed to get container status \"5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd\": rpc error: code = NotFound desc = could not find container \"5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd\": container with ID starting with 5d705d62d4cd6b3925489072f1c492c9c2174578a01670bb013266aac88579fd not found: ID does not exist" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 20:42:09.020209 4799 scope.go:117] "RemoveContainer" containerID="f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d" Mar 19 20:42:09 crc 
kubenswrapper[4799]: E0319 20:42:09.020691 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d\": container with ID starting with f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d not found: ID does not exist" containerID="f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 20:42:09.020746 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d"} err="failed to get container status \"f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d\": rpc error: code = NotFound desc = could not find container \"f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d\": container with ID starting with f465242c8825b213f7eb75c179ad68fc76e74778e1cf4939eab2300309f1ab6d not found: ID does not exist" Mar 19 20:42:09 crc kubenswrapper[4799]: I0319 20:42:09.142355 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" path="/var/lib/kubelet/pods/e6dadba5-8e05-4fb6-81de-aca68a7d5e66/volumes" Mar 19 20:42:28 crc kubenswrapper[4799]: I0319 20:42:28.755596 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:42:28 crc kubenswrapper[4799]: I0319 20:42:28.756410 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 19 20:42:48 crc kubenswrapper[4799]: I0319 20:42:48.358543 4799 generic.go:334] "Generic (PLEG): container finished" podID="880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" containerID="c612ed1d54ace51e0f2e07a6ca5560a78ef821bc0ece12953ee71a784dddc061" exitCode=0 Mar 19 20:42:48 crc kubenswrapper[4799]: I0319 20:42:48.358649 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" event={"ID":"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0","Type":"ContainerDied","Data":"c612ed1d54ace51e0f2e07a6ca5560a78ef821bc0ece12953ee71a784dddc061"} Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.734747 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.828249 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-secret-0\") pod \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.828317 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-inventory\") pod \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.828351 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtsbd\" (UniqueName: \"kubernetes.io/projected/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-kube-api-access-rtsbd\") pod \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.828506 4799 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-ssh-key-openstack-edpm-ipam\") pod \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.828542 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-combined-ca-bundle\") pod \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\" (UID: \"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0\") " Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.834848 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-kube-api-access-rtsbd" (OuterVolumeSpecName: "kube-api-access-rtsbd") pod "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" (UID: "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0"). InnerVolumeSpecName "kube-api-access-rtsbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.835118 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" (UID: "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.862027 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-inventory" (OuterVolumeSpecName: "inventory") pod "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" (UID: "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.863128 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" (UID: "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.863200 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" (UID: "880fa93c-771b-4896-9ab3-cb7d0a9ab3e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.930686 4799 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.930725 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.930736 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtsbd\" (UniqueName: \"kubernetes.io/projected/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-kube-api-access-rtsbd\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.930747 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:49 crc kubenswrapper[4799]: I0319 20:42:49.930757 4799 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/880fa93c-771b-4896-9ab3-cb7d0a9ab3e0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.376135 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" event={"ID":"880fa93c-771b-4896-9ab3-cb7d0a9ab3e0","Type":"ContainerDied","Data":"bcf65e86a723dd445c94d624e97b4968c7b4f3cc02a079266c7746556fd564ba"} Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.376175 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf65e86a723dd445c94d624e97b4968c7b4f3cc02a079266c7746556fd564ba" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.376255 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.477469 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx"] Mar 19 20:42:50 crc kubenswrapper[4799]: E0319 20:42:50.477838 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="extract-content" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.477856 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="extract-content" Mar 19 20:42:50 crc kubenswrapper[4799]: E0319 20:42:50.477877 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="registry-server" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.477884 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="registry-server" Mar 19 20:42:50 crc kubenswrapper[4799]: E0319 20:42:50.477913 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.477920 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 20:42:50 crc kubenswrapper[4799]: E0319 20:42:50.477927 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="extract-utilities" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.477934 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="extract-utilities" Mar 19 20:42:50 crc kubenswrapper[4799]: E0319 20:42:50.477953 4799 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68221cef-725e-4911-a651-39fc18fc5715" containerName="oc" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.477960 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="68221cef-725e-4911-a651-39fc18fc5715" containerName="oc" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.478151 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="880fa93c-771b-4896-9ab3-cb7d0a9ab3e0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.478163 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6dadba5-8e05-4fb6-81de-aca68a7d5e66" containerName="registry-server" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.478192 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="68221cef-725e-4911-a651-39fc18fc5715" containerName="oc" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.478859 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.484016 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.484192 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.484299 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.484599 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.484705 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.484795 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.485671 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.495071 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx"] Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.641834 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 
20:42:50.641930 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.641972 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.642002 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.642717 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.642791 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.642881 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9hq\" (UniqueName: \"kubernetes.io/projected/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-kube-api-access-nf9hq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.642923 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.642960 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.643086 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: 
\"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.643279 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746227 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746383 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746512 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746651 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746713 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746757 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746849 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.746927 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-3\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.747028 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.749119 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.747160 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.749432 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9hq\" (UniqueName: \"kubernetes.io/projected/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-kube-api-access-nf9hq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.750817 4799 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.751498 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.751703 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.752437 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.753630 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: 
\"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.754001 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.755188 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.755851 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.756544 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.765958 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nf9hq\" (UniqueName: \"kubernetes.io/projected/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-kube-api-access-nf9hq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5x7fx\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:50 crc kubenswrapper[4799]: I0319 20:42:50.800743 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:42:51 crc kubenswrapper[4799]: I0319 20:42:51.416353 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx"] Mar 19 20:42:52 crc kubenswrapper[4799]: I0319 20:42:52.413890 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" event={"ID":"0317cfee-27aa-4ba1-9c6a-cf2b368c811b","Type":"ContainerStarted","Data":"14f7ef0dee55f6a4345d1cba6979ae65ef191e9a422aaa355c6a1b8c7e50dabe"} Mar 19 20:42:52 crc kubenswrapper[4799]: I0319 20:42:52.414459 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" event={"ID":"0317cfee-27aa-4ba1-9c6a-cf2b368c811b","Type":"ContainerStarted","Data":"4fc960415021e100c47461e312ab6778e2454be0777d824ac677c42650dc910c"} Mar 19 20:42:52 crc kubenswrapper[4799]: I0319 20:42:52.446600 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" podStartSLOduration=2.040988658 podStartE2EDuration="2.446562315s" podCreationTimestamp="2026-03-19 20:42:50 +0000 UTC" firstStartedPulling="2026-03-19 20:42:51.426824258 +0000 UTC m=+2249.032777350" lastFinishedPulling="2026-03-19 20:42:51.832397935 +0000 UTC m=+2249.438351007" observedRunningTime="2026-03-19 20:42:52.42942028 +0000 UTC m=+2250.035373382" watchObservedRunningTime="2026-03-19 20:42:52.446562315 +0000 UTC m=+2250.052515427" Mar 19 
20:42:58 crc kubenswrapper[4799]: I0319 20:42:58.756613 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:42:58 crc kubenswrapper[4799]: I0319 20:42:58.757244 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:42:59 crc kubenswrapper[4799]: I0319 20:42:59.701264 4799 scope.go:117] "RemoveContainer" containerID="3b9c41610c105beb23934c157217ab50fea825905fc7c8485dd4ec3cb1b7dace" Mar 19 20:43:28 crc kubenswrapper[4799]: I0319 20:43:28.756125 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:43:28 crc kubenswrapper[4799]: I0319 20:43:28.756935 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:43:28 crc kubenswrapper[4799]: I0319 20:43:28.757003 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:43:28 crc kubenswrapper[4799]: I0319 20:43:28.758167 4799 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2740eb9aae6bdd4350798a29265d5bfc5c2dd48e02f1b9ff9360379372fe8e04"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:43:28 crc kubenswrapper[4799]: I0319 20:43:28.758271 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://2740eb9aae6bdd4350798a29265d5bfc5c2dd48e02f1b9ff9360379372fe8e04" gracePeriod=600 Mar 19 20:43:29 crc kubenswrapper[4799]: I0319 20:43:29.851510 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="2740eb9aae6bdd4350798a29265d5bfc5c2dd48e02f1b9ff9360379372fe8e04" exitCode=0 Mar 19 20:43:29 crc kubenswrapper[4799]: I0319 20:43:29.851573 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"2740eb9aae6bdd4350798a29265d5bfc5c2dd48e02f1b9ff9360379372fe8e04"} Mar 19 20:43:29 crc kubenswrapper[4799]: I0319 20:43:29.852095 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79"} Mar 19 20:43:29 crc kubenswrapper[4799]: I0319 20:43:29.852116 4799 scope.go:117] "RemoveContainer" containerID="ef06da259fd5d9d349710af8292c9061589bf70a09539d7bad3502aaa1c415ac" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.161083 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565884-s2x8j"] Mar 19 20:44:00 
crc kubenswrapper[4799]: I0319 20:44:00.164445 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.167971 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.168233 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.169499 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.172154 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565884-s2x8j"] Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.252747 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqmm\" (UniqueName: \"kubernetes.io/projected/dcd4b2ca-7025-43b4-82b0-26fd43c60018-kube-api-access-qwqmm\") pod \"auto-csr-approver-29565884-s2x8j\" (UID: \"dcd4b2ca-7025-43b4-82b0-26fd43c60018\") " pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.354596 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqmm\" (UniqueName: \"kubernetes.io/projected/dcd4b2ca-7025-43b4-82b0-26fd43c60018-kube-api-access-qwqmm\") pod \"auto-csr-approver-29565884-s2x8j\" (UID: \"dcd4b2ca-7025-43b4-82b0-26fd43c60018\") " pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.382999 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqmm\" (UniqueName: \"kubernetes.io/projected/dcd4b2ca-7025-43b4-82b0-26fd43c60018-kube-api-access-qwqmm\") 
pod \"auto-csr-approver-29565884-s2x8j\" (UID: \"dcd4b2ca-7025-43b4-82b0-26fd43c60018\") " pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:00 crc kubenswrapper[4799]: I0319 20:44:00.499571 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:01 crc kubenswrapper[4799]: I0319 20:44:01.022228 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565884-s2x8j"] Mar 19 20:44:01 crc kubenswrapper[4799]: I0319 20:44:01.242505 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" event={"ID":"dcd4b2ca-7025-43b4-82b0-26fd43c60018","Type":"ContainerStarted","Data":"041f83abe8b8df32393e8ab90058dc57099113576d9fc58cc52e84caf7dbf7ef"} Mar 19 20:44:02 crc kubenswrapper[4799]: I0319 20:44:02.252686 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" event={"ID":"dcd4b2ca-7025-43b4-82b0-26fd43c60018","Type":"ContainerStarted","Data":"81c7afe2a81edf7da56e6c21fd337a44316e802c9342ad22ba8a6a7f8d273ab2"} Mar 19 20:44:02 crc kubenswrapper[4799]: I0319 20:44:02.286878 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" podStartSLOduration=1.459887258 podStartE2EDuration="2.286846107s" podCreationTimestamp="2026-03-19 20:44:00 +0000 UTC" firstStartedPulling="2026-03-19 20:44:01.0313205 +0000 UTC m=+2318.637273612" lastFinishedPulling="2026-03-19 20:44:01.858279349 +0000 UTC m=+2319.464232461" observedRunningTime="2026-03-19 20:44:02.265273193 +0000 UTC m=+2319.871226275" watchObservedRunningTime="2026-03-19 20:44:02.286846107 +0000 UTC m=+2319.892799219" Mar 19 20:44:03 crc kubenswrapper[4799]: I0319 20:44:03.268255 4799 generic.go:334] "Generic (PLEG): container finished" podID="dcd4b2ca-7025-43b4-82b0-26fd43c60018" 
containerID="81c7afe2a81edf7da56e6c21fd337a44316e802c9342ad22ba8a6a7f8d273ab2" exitCode=0 Mar 19 20:44:03 crc kubenswrapper[4799]: I0319 20:44:03.268321 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" event={"ID":"dcd4b2ca-7025-43b4-82b0-26fd43c60018","Type":"ContainerDied","Data":"81c7afe2a81edf7da56e6c21fd337a44316e802c9342ad22ba8a6a7f8d273ab2"} Mar 19 20:44:04 crc kubenswrapper[4799]: I0319 20:44:04.679074 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:04 crc kubenswrapper[4799]: I0319 20:44:04.778531 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwqmm\" (UniqueName: \"kubernetes.io/projected/dcd4b2ca-7025-43b4-82b0-26fd43c60018-kube-api-access-qwqmm\") pod \"dcd4b2ca-7025-43b4-82b0-26fd43c60018\" (UID: \"dcd4b2ca-7025-43b4-82b0-26fd43c60018\") " Mar 19 20:44:04 crc kubenswrapper[4799]: I0319 20:44:04.787575 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd4b2ca-7025-43b4-82b0-26fd43c60018-kube-api-access-qwqmm" (OuterVolumeSpecName: "kube-api-access-qwqmm") pod "dcd4b2ca-7025-43b4-82b0-26fd43c60018" (UID: "dcd4b2ca-7025-43b4-82b0-26fd43c60018"). InnerVolumeSpecName "kube-api-access-qwqmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:44:04 crc kubenswrapper[4799]: I0319 20:44:04.881947 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwqmm\" (UniqueName: \"kubernetes.io/projected/dcd4b2ca-7025-43b4-82b0-26fd43c60018-kube-api-access-qwqmm\") on node \"crc\" DevicePath \"\"" Mar 19 20:44:05 crc kubenswrapper[4799]: I0319 20:44:05.304555 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" event={"ID":"dcd4b2ca-7025-43b4-82b0-26fd43c60018","Type":"ContainerDied","Data":"041f83abe8b8df32393e8ab90058dc57099113576d9fc58cc52e84caf7dbf7ef"} Mar 19 20:44:05 crc kubenswrapper[4799]: I0319 20:44:05.304618 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="041f83abe8b8df32393e8ab90058dc57099113576d9fc58cc52e84caf7dbf7ef" Mar 19 20:44:05 crc kubenswrapper[4799]: I0319 20:44:05.304623 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565884-s2x8j" Mar 19 20:44:05 crc kubenswrapper[4799]: I0319 20:44:05.356908 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565878-lb6vk"] Mar 19 20:44:05 crc kubenswrapper[4799]: I0319 20:44:05.370002 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565878-lb6vk"] Mar 19 20:44:07 crc kubenswrapper[4799]: I0319 20:44:07.131791 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f7aa0e-6dd5-490e-ab10-29499246aeb6" path="/var/lib/kubelet/pods/99f7aa0e-6dd5-490e-ab10-29499246aeb6/volumes" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.351686 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nktbl"] Mar 19 20:44:18 crc kubenswrapper[4799]: E0319 20:44:18.353346 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dcd4b2ca-7025-43b4-82b0-26fd43c60018" containerName="oc" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.353376 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd4b2ca-7025-43b4-82b0-26fd43c60018" containerName="oc" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.353877 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd4b2ca-7025-43b4-82b0-26fd43c60018" containerName="oc" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.356550 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.389137 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktbl"] Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.489354 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-utilities\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.489429 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvbd\" (UniqueName: \"kubernetes.io/projected/0d34fde5-47a1-42a0-8492-3bc55f3fd587-kube-api-access-prvbd\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.489474 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-catalog-content\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " 
pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.592567 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-utilities\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.592648 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prvbd\" (UniqueName: \"kubernetes.io/projected/0d34fde5-47a1-42a0-8492-3bc55f3fd587-kube-api-access-prvbd\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.592713 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-catalog-content\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.593010 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-utilities\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.593540 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-catalog-content\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" 
Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.620942 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvbd\" (UniqueName: \"kubernetes.io/projected/0d34fde5-47a1-42a0-8492-3bc55f3fd587-kube-api-access-prvbd\") pod \"redhat-marketplace-nktbl\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:18 crc kubenswrapper[4799]: I0319 20:44:18.687426 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:19 crc kubenswrapper[4799]: I0319 20:44:19.168105 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktbl"] Mar 19 20:44:19 crc kubenswrapper[4799]: I0319 20:44:19.464394 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerID="c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8" exitCode=0 Mar 19 20:44:19 crc kubenswrapper[4799]: I0319 20:44:19.464434 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktbl" event={"ID":"0d34fde5-47a1-42a0-8492-3bc55f3fd587","Type":"ContainerDied","Data":"c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8"} Mar 19 20:44:19 crc kubenswrapper[4799]: I0319 20:44:19.464848 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktbl" event={"ID":"0d34fde5-47a1-42a0-8492-3bc55f3fd587","Type":"ContainerStarted","Data":"dd5b8f99e5d1cb9ea75669d4512be9a155f5feda6fcd0d5de5d8f5f7142cbda3"} Mar 19 20:44:21 crc kubenswrapper[4799]: I0319 20:44:21.491933 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerID="90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7" exitCode=0 Mar 19 20:44:21 crc kubenswrapper[4799]: I0319 20:44:21.492523 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktbl" event={"ID":"0d34fde5-47a1-42a0-8492-3bc55f3fd587","Type":"ContainerDied","Data":"90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7"} Mar 19 20:44:22 crc kubenswrapper[4799]: I0319 20:44:22.506886 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktbl" event={"ID":"0d34fde5-47a1-42a0-8492-3bc55f3fd587","Type":"ContainerStarted","Data":"af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91"} Mar 19 20:44:22 crc kubenswrapper[4799]: I0319 20:44:22.531410 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nktbl" podStartSLOduration=2.057315962 podStartE2EDuration="4.531269231s" podCreationTimestamp="2026-03-19 20:44:18 +0000 UTC" firstStartedPulling="2026-03-19 20:44:19.466865405 +0000 UTC m=+2337.072818477" lastFinishedPulling="2026-03-19 20:44:21.940818664 +0000 UTC m=+2339.546771746" observedRunningTime="2026-03-19 20:44:22.524405255 +0000 UTC m=+2340.130358327" watchObservedRunningTime="2026-03-19 20:44:22.531269231 +0000 UTC m=+2340.137222303" Mar 19 20:44:28 crc kubenswrapper[4799]: I0319 20:44:28.688219 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:28 crc kubenswrapper[4799]: I0319 20:44:28.688916 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:28 crc kubenswrapper[4799]: I0319 20:44:28.746755 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:29 crc kubenswrapper[4799]: I0319 20:44:29.661824 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:29 crc 
kubenswrapper[4799]: I0319 20:44:29.738797 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktbl"] Mar 19 20:44:31 crc kubenswrapper[4799]: I0319 20:44:31.610633 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nktbl" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="registry-server" containerID="cri-o://af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91" gracePeriod=2 Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.110484 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.256650 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-catalog-content\") pod \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.256781 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prvbd\" (UniqueName: \"kubernetes.io/projected/0d34fde5-47a1-42a0-8492-3bc55f3fd587-kube-api-access-prvbd\") pod \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.256859 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-utilities\") pod \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\" (UID: \"0d34fde5-47a1-42a0-8492-3bc55f3fd587\") " Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.258489 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-utilities" (OuterVolumeSpecName: "utilities") pod "0d34fde5-47a1-42a0-8492-3bc55f3fd587" (UID: "0d34fde5-47a1-42a0-8492-3bc55f3fd587"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.265144 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d34fde5-47a1-42a0-8492-3bc55f3fd587-kube-api-access-prvbd" (OuterVolumeSpecName: "kube-api-access-prvbd") pod "0d34fde5-47a1-42a0-8492-3bc55f3fd587" (UID: "0d34fde5-47a1-42a0-8492-3bc55f3fd587"). InnerVolumeSpecName "kube-api-access-prvbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.293980 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d34fde5-47a1-42a0-8492-3bc55f3fd587" (UID: "0d34fde5-47a1-42a0-8492-3bc55f3fd587"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.358616 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.358652 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prvbd\" (UniqueName: \"kubernetes.io/projected/0d34fde5-47a1-42a0-8492-3bc55f3fd587-kube-api-access-prvbd\") on node \"crc\" DevicePath \"\"" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.358665 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d34fde5-47a1-42a0-8492-3bc55f3fd587-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.626412 4799 generic.go:334] "Generic (PLEG): container finished" podID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerID="af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91" exitCode=0 Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.626474 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktbl" event={"ID":"0d34fde5-47a1-42a0-8492-3bc55f3fd587","Type":"ContainerDied","Data":"af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91"} Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.626516 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nktbl" event={"ID":"0d34fde5-47a1-42a0-8492-3bc55f3fd587","Type":"ContainerDied","Data":"dd5b8f99e5d1cb9ea75669d4512be9a155f5feda6fcd0d5de5d8f5f7142cbda3"} Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.626547 4799 scope.go:117] "RemoveContainer" containerID="af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 
20:44:32.626761 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nktbl" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.665309 4799 scope.go:117] "RemoveContainer" containerID="90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.691071 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktbl"] Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.703236 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nktbl"] Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.707786 4799 scope.go:117] "RemoveContainer" containerID="c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.764760 4799 scope.go:117] "RemoveContainer" containerID="af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91" Mar 19 20:44:32 crc kubenswrapper[4799]: E0319 20:44:32.766789 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91\": container with ID starting with af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91 not found: ID does not exist" containerID="af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.766847 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91"} err="failed to get container status \"af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91\": rpc error: code = NotFound desc = could not find container \"af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91\": container with ID starting with 
af0379cc9c7d28e314447feb51acebbe953ef36245dd255ddc45aea95f902c91 not found: ID does not exist" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.766878 4799 scope.go:117] "RemoveContainer" containerID="90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7" Mar 19 20:44:32 crc kubenswrapper[4799]: E0319 20:44:32.767281 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7\": container with ID starting with 90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7 not found: ID does not exist" containerID="90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.767308 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7"} err="failed to get container status \"90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7\": rpc error: code = NotFound desc = could not find container \"90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7\": container with ID starting with 90351cfa726ab73254b7162967c5edba7101805a371048eb50c6b7310879e7c7 not found: ID does not exist" Mar 19 20:44:32 crc kubenswrapper[4799]: I0319 20:44:32.767325 4799 scope.go:117] "RemoveContainer" containerID="c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8" Mar 19 20:44:32 crc kubenswrapper[4799]: E0319 20:44:32.767755 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8\": container with ID starting with c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8 not found: ID does not exist" containerID="c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8" Mar 19 20:44:32 crc 
kubenswrapper[4799]: I0319 20:44:32.767817 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8"} err="failed to get container status \"c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8\": rpc error: code = NotFound desc = could not find container \"c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8\": container with ID starting with c328d4cf8364ce81c4265b9da11cc5dd0df6e6399febfe859b5e9587be7114e8 not found: ID does not exist" Mar 19 20:44:33 crc kubenswrapper[4799]: I0319 20:44:33.140366 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" path="/var/lib/kubelet/pods/0d34fde5-47a1-42a0-8492-3bc55f3fd587/volumes" Mar 19 20:44:59 crc kubenswrapper[4799]: I0319 20:44:59.846499 4799 scope.go:117] "RemoveContainer" containerID="87893d72c37db67468a4e9258e2ac164f1a0bdbe300405120bd9fea49bec9449" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.151779 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg"] Mar 19 20:45:00 crc kubenswrapper[4799]: E0319 20:45:00.152566 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="extract-utilities" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.152590 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="extract-utilities" Mar 19 20:45:00 crc kubenswrapper[4799]: E0319 20:45:00.152616 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="extract-content" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.152625 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="extract-content" Mar 19 
20:45:00 crc kubenswrapper[4799]: E0319 20:45:00.152664 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="registry-server" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.152672 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="registry-server" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.152901 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d34fde5-47a1-42a0-8492-3bc55f3fd587" containerName="registry-server" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.153701 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.155913 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.156352 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.166539 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg"] Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.324214 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17472c0c-5784-426d-813c-ed7552f35a26-config-volume\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.324853 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-d9jgw\" (UniqueName: \"kubernetes.io/projected/17472c0c-5784-426d-813c-ed7552f35a26-kube-api-access-d9jgw\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.324997 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17472c0c-5784-426d-813c-ed7552f35a26-secret-volume\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.426963 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9jgw\" (UniqueName: \"kubernetes.io/projected/17472c0c-5784-426d-813c-ed7552f35a26-kube-api-access-d9jgw\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.427018 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17472c0c-5784-426d-813c-ed7552f35a26-secret-volume\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.427093 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17472c0c-5784-426d-813c-ed7552f35a26-config-volume\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.428136 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17472c0c-5784-426d-813c-ed7552f35a26-config-volume\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.438562 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17472c0c-5784-426d-813c-ed7552f35a26-secret-volume\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.457292 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9jgw\" (UniqueName: \"kubernetes.io/projected/17472c0c-5784-426d-813c-ed7552f35a26-kube-api-access-d9jgw\") pod \"collect-profiles-29565885-ff8lg\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.477792 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:00 crc kubenswrapper[4799]: I0319 20:45:00.996630 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg"] Mar 19 20:45:01 crc kubenswrapper[4799]: I0319 20:45:01.986610 4799 generic.go:334] "Generic (PLEG): container finished" podID="17472c0c-5784-426d-813c-ed7552f35a26" containerID="ec7aaa65db71f8a552401c350132ba606f8a017068e6d3b666e7da3d010847eb" exitCode=0 Mar 19 20:45:01 crc kubenswrapper[4799]: I0319 20:45:01.986760 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" event={"ID":"17472c0c-5784-426d-813c-ed7552f35a26","Type":"ContainerDied","Data":"ec7aaa65db71f8a552401c350132ba606f8a017068e6d3b666e7da3d010847eb"} Mar 19 20:45:01 crc kubenswrapper[4799]: I0319 20:45:01.987057 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" event={"ID":"17472c0c-5784-426d-813c-ed7552f35a26","Type":"ContainerStarted","Data":"5d66801961f849e1e6ccb1d2c606c2989b08f886398fe5faca0008bda842d86b"} Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.435935 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.609159 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17472c0c-5784-426d-813c-ed7552f35a26-config-volume\") pod \"17472c0c-5784-426d-813c-ed7552f35a26\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.609715 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9jgw\" (UniqueName: \"kubernetes.io/projected/17472c0c-5784-426d-813c-ed7552f35a26-kube-api-access-d9jgw\") pod \"17472c0c-5784-426d-813c-ed7552f35a26\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.609807 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17472c0c-5784-426d-813c-ed7552f35a26-secret-volume\") pod \"17472c0c-5784-426d-813c-ed7552f35a26\" (UID: \"17472c0c-5784-426d-813c-ed7552f35a26\") " Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.610369 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17472c0c-5784-426d-813c-ed7552f35a26-config-volume" (OuterVolumeSpecName: "config-volume") pod "17472c0c-5784-426d-813c-ed7552f35a26" (UID: "17472c0c-5784-426d-813c-ed7552f35a26"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.610885 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17472c0c-5784-426d-813c-ed7552f35a26-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.616775 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17472c0c-5784-426d-813c-ed7552f35a26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "17472c0c-5784-426d-813c-ed7552f35a26" (UID: "17472c0c-5784-426d-813c-ed7552f35a26"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.623666 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17472c0c-5784-426d-813c-ed7552f35a26-kube-api-access-d9jgw" (OuterVolumeSpecName: "kube-api-access-d9jgw") pod "17472c0c-5784-426d-813c-ed7552f35a26" (UID: "17472c0c-5784-426d-813c-ed7552f35a26"). InnerVolumeSpecName "kube-api-access-d9jgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.712847 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9jgw\" (UniqueName: \"kubernetes.io/projected/17472c0c-5784-426d-813c-ed7552f35a26-kube-api-access-d9jgw\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:03 crc kubenswrapper[4799]: I0319 20:45:03.712889 4799 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/17472c0c-5784-426d-813c-ed7552f35a26-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:04 crc kubenswrapper[4799]: I0319 20:45:04.009267 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" event={"ID":"17472c0c-5784-426d-813c-ed7552f35a26","Type":"ContainerDied","Data":"5d66801961f849e1e6ccb1d2c606c2989b08f886398fe5faca0008bda842d86b"} Mar 19 20:45:04 crc kubenswrapper[4799]: I0319 20:45:04.009308 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d66801961f849e1e6ccb1d2c606c2989b08f886398fe5faca0008bda842d86b" Mar 19 20:45:04 crc kubenswrapper[4799]: I0319 20:45:04.009310 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565885-ff8lg" Mar 19 20:45:04 crc kubenswrapper[4799]: I0319 20:45:04.561371 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk"] Mar 19 20:45:04 crc kubenswrapper[4799]: I0319 20:45:04.568725 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-27mqk"] Mar 19 20:45:05 crc kubenswrapper[4799]: I0319 20:45:05.143664 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8720bb6-e6ea-43b3-a750-d0b7c1221266" path="/var/lib/kubelet/pods/f8720bb6-e6ea-43b3-a750-d0b7c1221266/volumes" Mar 19 20:45:18 crc kubenswrapper[4799]: I0319 20:45:18.176231 4799 generic.go:334] "Generic (PLEG): container finished" podID="0317cfee-27aa-4ba1-9c6a-cf2b368c811b" containerID="14f7ef0dee55f6a4345d1cba6979ae65ef191e9a422aaa355c6a1b8c7e50dabe" exitCode=0 Mar 19 20:45:18 crc kubenswrapper[4799]: I0319 20:45:18.176305 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" event={"ID":"0317cfee-27aa-4ba1-9c6a-cf2b368c811b","Type":"ContainerDied","Data":"14f7ef0dee55f6a4345d1cba6979ae65ef191e9a422aaa355c6a1b8c7e50dabe"} Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.725810 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.773563 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-0\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.773630 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-ssh-key-openstack-edpm-ipam\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.773669 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-3\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.773711 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf9hq\" (UniqueName: \"kubernetes.io/projected/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-kube-api-access-nf9hq\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.773755 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-1\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 
20:45:19.773966 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-combined-ca-bundle\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.774030 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-0\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.774064 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-extra-config-0\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.774169 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-2\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.774215 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-1\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.774269 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-inventory\") pod \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\" (UID: \"0317cfee-27aa-4ba1-9c6a-cf2b368c811b\") " Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.787714 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.795482 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-kube-api-access-nf9hq" (OuterVolumeSpecName: "kube-api-access-nf9hq") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "kube-api-access-nf9hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.817091 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.818195 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.821674 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.823599 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.832327 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.839259 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-inventory" (OuterVolumeSpecName: "inventory") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.839667 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.844142 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.853688 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0317cfee-27aa-4ba1-9c6a-cf2b368c811b" (UID: "0317cfee-27aa-4ba1-9c6a-cf2b368c811b"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.877979 4799 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878018 4799 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878029 4799 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878076 4799 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878088 4799 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878098 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878108 4799 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-0\") on 
node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878117 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878125 4799 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878135 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf9hq\" (UniqueName: \"kubernetes.io/projected/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-kube-api-access-nf9hq\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:19 crc kubenswrapper[4799]: I0319 20:45:19.878148 4799 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0317cfee-27aa-4ba1-9c6a-cf2b368c811b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.217114 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" event={"ID":"0317cfee-27aa-4ba1-9c6a-cf2b368c811b","Type":"ContainerDied","Data":"4fc960415021e100c47461e312ab6778e2454be0777d824ac677c42650dc910c"} Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.217453 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc960415021e100c47461e312ab6778e2454be0777d824ac677c42650dc910c" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.217195 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5x7fx" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.357184 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk"] Mar 19 20:45:20 crc kubenswrapper[4799]: E0319 20:45:20.357978 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17472c0c-5784-426d-813c-ed7552f35a26" containerName="collect-profiles" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.358013 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="17472c0c-5784-426d-813c-ed7552f35a26" containerName="collect-profiles" Mar 19 20:45:20 crc kubenswrapper[4799]: E0319 20:45:20.358090 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0317cfee-27aa-4ba1-9c6a-cf2b368c811b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.358109 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0317cfee-27aa-4ba1-9c6a-cf2b368c811b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.358454 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0317cfee-27aa-4ba1-9c6a-cf2b368c811b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.358494 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="17472c0c-5784-426d-813c-ed7552f35a26" containerName="collect-profiles" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.359732 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.361732 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.362624 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.362707 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.363128 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.369346 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-45lfx" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.370139 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk"] Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.494957 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.495016 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: 
\"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.495171 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.495329 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.495401 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.495529 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.495689 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhn7\" (UniqueName: \"kubernetes.io/projected/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-kube-api-access-tmhn7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.597806 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.597946 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhn7\" (UniqueName: \"kubernetes.io/projected/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-kube-api-access-tmhn7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.598007 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.598051 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.598199 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.598268 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.598309 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.602331 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.603421 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.604070 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.604761 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.605867 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.606129 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.618627 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhn7\" (UniqueName: \"kubernetes.io/projected/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-kube-api-access-tmhn7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-j27sk\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:20 crc kubenswrapper[4799]: I0319 20:45:20.692230 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:45:21 crc kubenswrapper[4799]: I0319 20:45:21.262334 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk"] Mar 19 20:45:21 crc kubenswrapper[4799]: W0319 20:45:21.265840 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd77bbb41_e2be_4c81_b266_92ca0b6e8b44.slice/crio-e77908521d6064cbbbc9398f22c1c430974704bb07aab85810225e7425e32004 WatchSource:0}: Error finding container e77908521d6064cbbbc9398f22c1c430974704bb07aab85810225e7425e32004: Status 404 returned error can't find the container with id e77908521d6064cbbbc9398f22c1c430974704bb07aab85810225e7425e32004 Mar 19 20:45:22 crc kubenswrapper[4799]: I0319 20:45:22.243376 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" event={"ID":"d77bbb41-e2be-4c81-b266-92ca0b6e8b44","Type":"ContainerStarted","Data":"e776d478e4800b944c9e977cbf3f152491907bc5d62d5b0c23a8a578e0f48568"} Mar 19 20:45:22 crc kubenswrapper[4799]: I0319 20:45:22.243933 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" event={"ID":"d77bbb41-e2be-4c81-b266-92ca0b6e8b44","Type":"ContainerStarted","Data":"e77908521d6064cbbbc9398f22c1c430974704bb07aab85810225e7425e32004"} Mar 19 20:45:22 crc kubenswrapper[4799]: I0319 20:45:22.270942 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" podStartSLOduration=1.830589209 podStartE2EDuration="2.270909093s" podCreationTimestamp="2026-03-19 20:45:20 +0000 UTC" firstStartedPulling="2026-03-19 20:45:21.269362254 +0000 UTC m=+2398.875315366" lastFinishedPulling="2026-03-19 20:45:21.709682148 +0000 UTC m=+2399.315635250" 
observedRunningTime="2026-03-19 20:45:22.264334996 +0000 UTC m=+2399.870288158" watchObservedRunningTime="2026-03-19 20:45:22.270909093 +0000 UTC m=+2399.876862215" Mar 19 20:45:58 crc kubenswrapper[4799]: I0319 20:45:58.755466 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:45:58 crc kubenswrapper[4799]: I0319 20:45:58.756231 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:45:59 crc kubenswrapper[4799]: I0319 20:45:59.951988 4799 scope.go:117] "RemoveContainer" containerID="b9e9a56806475d735ebce84ef1fa1d7783c9dcff70f00bcfab9b6d2c920fe173" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.162855 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565886-hlhm2"] Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.165075 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.167596 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.171704 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.171733 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.185882 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565886-hlhm2"] Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.254851 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862q2\" (UniqueName: \"kubernetes.io/projected/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7-kube-api-access-862q2\") pod \"auto-csr-approver-29565886-hlhm2\" (UID: \"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7\") " pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.357754 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-862q2\" (UniqueName: \"kubernetes.io/projected/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7-kube-api-access-862q2\") pod \"auto-csr-approver-29565886-hlhm2\" (UID: \"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7\") " pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.381500 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862q2\" (UniqueName: \"kubernetes.io/projected/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7-kube-api-access-862q2\") pod \"auto-csr-approver-29565886-hlhm2\" (UID: \"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7\") " 
pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.493310 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:00 crc kubenswrapper[4799]: I0319 20:46:00.994589 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565886-hlhm2"] Mar 19 20:46:01 crc kubenswrapper[4799]: I0319 20:46:01.695290 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" event={"ID":"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7","Type":"ContainerStarted","Data":"9f10c7c4473665d921ee2c433734946f7da9f406b2fb5a6bc17b7731b05f5b37"} Mar 19 20:46:02 crc kubenswrapper[4799]: I0319 20:46:02.725001 4799 generic.go:334] "Generic (PLEG): container finished" podID="5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7" containerID="8ec7d3e47f37efbe7a74979f4d5a811ccdb5df4462ecfe7f8fb3d637bc923926" exitCode=0 Mar 19 20:46:02 crc kubenswrapper[4799]: I0319 20:46:02.725408 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" event={"ID":"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7","Type":"ContainerDied","Data":"8ec7d3e47f37efbe7a74979f4d5a811ccdb5df4462ecfe7f8fb3d637bc923926"} Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.128023 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.254663 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-862q2\" (UniqueName: \"kubernetes.io/projected/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7-kube-api-access-862q2\") pod \"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7\" (UID: \"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7\") " Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.269577 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7-kube-api-access-862q2" (OuterVolumeSpecName: "kube-api-access-862q2") pod "5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7" (UID: "5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7"). InnerVolumeSpecName "kube-api-access-862q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.356720 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-862q2\" (UniqueName: \"kubernetes.io/projected/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7-kube-api-access-862q2\") on node \"crc\" DevicePath \"\"" Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.744602 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" event={"ID":"5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7","Type":"ContainerDied","Data":"9f10c7c4473665d921ee2c433734946f7da9f406b2fb5a6bc17b7731b05f5b37"} Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.744675 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f10c7c4473665d921ee2c433734946f7da9f406b2fb5a6bc17b7731b05f5b37" Mar 19 20:46:04 crc kubenswrapper[4799]: I0319 20:46:04.744628 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565886-hlhm2" Mar 19 20:46:05 crc kubenswrapper[4799]: I0319 20:46:05.200314 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565880-mj9lx"] Mar 19 20:46:05 crc kubenswrapper[4799]: I0319 20:46:05.208132 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565880-mj9lx"] Mar 19 20:46:07 crc kubenswrapper[4799]: I0319 20:46:07.135160 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18033e1d-e2e2-45d9-af5c-cf74cbb08471" path="/var/lib/kubelet/pods/18033e1d-e2e2-45d9-af5c-cf74cbb08471/volumes" Mar 19 20:46:28 crc kubenswrapper[4799]: I0319 20:46:28.756771 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:46:28 crc kubenswrapper[4799]: I0319 20:46:28.757566 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.064081 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nfbnk"] Mar 19 20:46:41 crc kubenswrapper[4799]: E0319 20:46:41.065912 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7" containerName="oc" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.065937 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7" containerName="oc" Mar 19 20:46:41 crc 
kubenswrapper[4799]: I0319 20:46:41.066249 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7" containerName="oc" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.068350 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.104088 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfbnk"] Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.134485 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-utilities\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.134586 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76ztf\" (UniqueName: \"kubernetes.io/projected/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-kube-api-access-76ztf\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.134721 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-catalog-content\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.236499 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76ztf\" (UniqueName: 
\"kubernetes.io/projected/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-kube-api-access-76ztf\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.236638 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-catalog-content\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.236742 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-utilities\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.237300 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-utilities\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.237937 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-catalog-content\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.261896 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76ztf\" (UniqueName: 
\"kubernetes.io/projected/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-kube-api-access-76ztf\") pod \"community-operators-nfbnk\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:41 crc kubenswrapper[4799]: I0319 20:46:41.409892 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:42 crc kubenswrapper[4799]: I0319 20:46:42.070978 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nfbnk"] Mar 19 20:46:42 crc kubenswrapper[4799]: I0319 20:46:42.185724 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerStarted","Data":"7084030d84bd3fe4606957e78dd31f20dba7326a7d12605233fd5f00feaf8e80"} Mar 19 20:46:43 crc kubenswrapper[4799]: I0319 20:46:43.195143 4799 generic.go:334] "Generic (PLEG): container finished" podID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerID="99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99" exitCode=0 Mar 19 20:46:43 crc kubenswrapper[4799]: I0319 20:46:43.195186 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerDied","Data":"99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99"} Mar 19 20:46:44 crc kubenswrapper[4799]: I0319 20:46:44.207429 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerStarted","Data":"47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b"} Mar 19 20:46:45 crc kubenswrapper[4799]: I0319 20:46:45.223067 4799 generic.go:334] "Generic (PLEG): container finished" podID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" 
containerID="47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b" exitCode=0 Mar 19 20:46:45 crc kubenswrapper[4799]: I0319 20:46:45.223420 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerDied","Data":"47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b"} Mar 19 20:46:46 crc kubenswrapper[4799]: I0319 20:46:46.238147 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerStarted","Data":"c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd"} Mar 19 20:46:46 crc kubenswrapper[4799]: I0319 20:46:46.259020 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nfbnk" podStartSLOduration=2.700628159 podStartE2EDuration="5.259004452s" podCreationTimestamp="2026-03-19 20:46:41 +0000 UTC" firstStartedPulling="2026-03-19 20:46:43.197565869 +0000 UTC m=+2480.803518961" lastFinishedPulling="2026-03-19 20:46:45.755942172 +0000 UTC m=+2483.361895254" observedRunningTime="2026-03-19 20:46:46.255048989 +0000 UTC m=+2483.861002061" watchObservedRunningTime="2026-03-19 20:46:46.259004452 +0000 UTC m=+2483.864957524" Mar 19 20:46:51 crc kubenswrapper[4799]: I0319 20:46:51.410885 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:51 crc kubenswrapper[4799]: I0319 20:46:51.411901 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:51 crc kubenswrapper[4799]: I0319 20:46:51.501645 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:52 crc kubenswrapper[4799]: I0319 
20:46:52.369476 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:52 crc kubenswrapper[4799]: I0319 20:46:52.414299 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfbnk"] Mar 19 20:46:54 crc kubenswrapper[4799]: I0319 20:46:54.332505 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nfbnk" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="registry-server" containerID="cri-o://c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd" gracePeriod=2 Mar 19 20:46:54 crc kubenswrapper[4799]: I0319 20:46:54.929340 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.034471 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76ztf\" (UniqueName: \"kubernetes.io/projected/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-kube-api-access-76ztf\") pod \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.034573 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-catalog-content\") pod \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.034759 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-utilities\") pod \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\" (UID: \"c18c1638-a9d7-44ac-9c69-eb7d9d092c36\") " Mar 19 20:46:55 crc kubenswrapper[4799]: 
I0319 20:46:55.038497 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-utilities" (OuterVolumeSpecName: "utilities") pod "c18c1638-a9d7-44ac-9c69-eb7d9d092c36" (UID: "c18c1638-a9d7-44ac-9c69-eb7d9d092c36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.041822 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-kube-api-access-76ztf" (OuterVolumeSpecName: "kube-api-access-76ztf") pod "c18c1638-a9d7-44ac-9c69-eb7d9d092c36" (UID: "c18c1638-a9d7-44ac-9c69-eb7d9d092c36"). InnerVolumeSpecName "kube-api-access-76ztf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.095970 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c18c1638-a9d7-44ac-9c69-eb7d9d092c36" (UID: "c18c1638-a9d7-44ac-9c69-eb7d9d092c36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.136887 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.136921 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76ztf\" (UniqueName: \"kubernetes.io/projected/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-kube-api-access-76ztf\") on node \"crc\" DevicePath \"\"" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.136934 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18c1638-a9d7-44ac-9c69-eb7d9d092c36-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.347791 4799 generic.go:334] "Generic (PLEG): container finished" podID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerID="c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd" exitCode=0 Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.347863 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerDied","Data":"c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd"} Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.347872 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nfbnk" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.347922 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nfbnk" event={"ID":"c18c1638-a9d7-44ac-9c69-eb7d9d092c36","Type":"ContainerDied","Data":"7084030d84bd3fe4606957e78dd31f20dba7326a7d12605233fd5f00feaf8e80"} Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.347952 4799 scope.go:117] "RemoveContainer" containerID="c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.385062 4799 scope.go:117] "RemoveContainer" containerID="47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.388859 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nfbnk"] Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.398289 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nfbnk"] Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.415614 4799 scope.go:117] "RemoveContainer" containerID="99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.472104 4799 scope.go:117] "RemoveContainer" containerID="c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd" Mar 19 20:46:55 crc kubenswrapper[4799]: E0319 20:46:55.472832 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd\": container with ID starting with c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd not found: ID does not exist" containerID="c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.472895 4799 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd"} err="failed to get container status \"c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd\": rpc error: code = NotFound desc = could not find container \"c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd\": container with ID starting with c060ea5f18f30600fdf1725dcac943dda0ad07979eb00d3208fd8e03ff5d03bd not found: ID does not exist" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.472928 4799 scope.go:117] "RemoveContainer" containerID="47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b" Mar 19 20:46:55 crc kubenswrapper[4799]: E0319 20:46:55.473338 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b\": container with ID starting with 47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b not found: ID does not exist" containerID="47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.473425 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b"} err="failed to get container status \"47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b\": rpc error: code = NotFound desc = could not find container \"47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b\": container with ID starting with 47f8a40cb617d69756db4b7e9cec8a832d90c3ce554bc9aa7b496bf200f9424b not found: ID does not exist" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.473468 4799 scope.go:117] "RemoveContainer" containerID="99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99" Mar 19 20:46:55 crc kubenswrapper[4799]: E0319 
20:46:55.474104 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99\": container with ID starting with 99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99 not found: ID does not exist" containerID="99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99" Mar 19 20:46:55 crc kubenswrapper[4799]: I0319 20:46:55.474159 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99"} err="failed to get container status \"99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99\": rpc error: code = NotFound desc = could not find container \"99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99\": container with ID starting with 99534ddf1fcb7a0094c2775b82ab095d8e9a5c105989b1c78abb36b596ec4c99 not found: ID does not exist" Mar 19 20:46:57 crc kubenswrapper[4799]: I0319 20:46:57.136837 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" path="/var/lib/kubelet/pods/c18c1638-a9d7-44ac-9c69-eb7d9d092c36/volumes" Mar 19 20:46:58 crc kubenswrapper[4799]: I0319 20:46:58.756221 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:46:58 crc kubenswrapper[4799]: I0319 20:46:58.756806 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 19 20:46:58 crc kubenswrapper[4799]: I0319 20:46:58.756867 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:46:58 crc kubenswrapper[4799]: I0319 20:46:58.757880 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:46:58 crc kubenswrapper[4799]: I0319 20:46:58.757984 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" gracePeriod=600 Mar 19 20:46:58 crc kubenswrapper[4799]: E0319 20:46:58.895861 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:46:59 crc kubenswrapper[4799]: I0319 20:46:59.397991 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" exitCode=0 Mar 19 20:46:59 crc kubenswrapper[4799]: I0319 20:46:59.398039 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79"} Mar 19 20:46:59 crc kubenswrapper[4799]: I0319 20:46:59.398074 4799 scope.go:117] "RemoveContainer" containerID="2740eb9aae6bdd4350798a29265d5bfc5c2dd48e02f1b9ff9360379372fe8e04" Mar 19 20:46:59 crc kubenswrapper[4799]: I0319 20:46:59.398706 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:46:59 crc kubenswrapper[4799]: E0319 20:46:59.399183 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:47:00 crc kubenswrapper[4799]: I0319 20:47:00.020229 4799 scope.go:117] "RemoveContainer" containerID="ce7f276a5c90b63a961c7df42c5116a4e6ea781d30ace429263f8c3937097c68" Mar 19 20:47:13 crc kubenswrapper[4799]: I0319 20:47:13.129691 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:47:13 crc kubenswrapper[4799]: E0319 20:47:13.132226 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:47:27 crc kubenswrapper[4799]: I0319 20:47:27.117579 4799 scope.go:117] "RemoveContainer" 
containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:47:27 crc kubenswrapper[4799]: E0319 20:47:27.118898 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:47:42 crc kubenswrapper[4799]: I0319 20:47:42.116583 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:47:42 crc kubenswrapper[4799]: E0319 20:47:42.119067 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:47:49 crc kubenswrapper[4799]: I0319 20:47:49.015662 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" event={"ID":"d77bbb41-e2be-4c81-b266-92ca0b6e8b44","Type":"ContainerDied","Data":"e776d478e4800b944c9e977cbf3f152491907bc5d62d5b0c23a8a578e0f48568"} Mar 19 20:47:49 crc kubenswrapper[4799]: I0319 20:47:49.015606 4799 generic.go:334] "Generic (PLEG): container finished" podID="d77bbb41-e2be-4c81-b266-92ca0b6e8b44" containerID="e776d478e4800b944c9e977cbf3f152491907bc5d62d5b0c23a8a578e0f48568" exitCode=0 Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.594447 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.751778 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ssh-key-openstack-edpm-ipam\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.752349 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmhn7\" (UniqueName: \"kubernetes.io/projected/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-kube-api-access-tmhn7\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.752488 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-0\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.752580 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-inventory\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.752668 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-2\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 
20:47:50.752716 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-1\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.752801 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-telemetry-combined-ca-bundle\") pod \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\" (UID: \"d77bbb41-e2be-4c81-b266-92ca0b6e8b44\") " Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.759973 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-kube-api-access-tmhn7" (OuterVolumeSpecName: "kube-api-access-tmhn7") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). InnerVolumeSpecName "kube-api-access-tmhn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.761020 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.791483 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-inventory" (OuterVolumeSpecName: "inventory") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.801764 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.806934 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.809049 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.810445 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d77bbb41-e2be-4c81-b266-92ca0b6e8b44" (UID: "d77bbb41-e2be-4c81-b266-92ca0b6e8b44"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.855883 4799 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.855937 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.855951 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmhn7\" (UniqueName: \"kubernetes.io/projected/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-kube-api-access-tmhn7\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.855965 4799 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.855979 4799 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.855992 4799 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:50 crc kubenswrapper[4799]: I0319 20:47:50.856006 4799 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/d77bbb41-e2be-4c81-b266-92ca0b6e8b44-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 20:47:51 crc kubenswrapper[4799]: I0319 20:47:51.042476 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" event={"ID":"d77bbb41-e2be-4c81-b266-92ca0b6e8b44","Type":"ContainerDied","Data":"e77908521d6064cbbbc9398f22c1c430974704bb07aab85810225e7425e32004"} Mar 19 20:47:51 crc kubenswrapper[4799]: I0319 20:47:51.042590 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e77908521d6064cbbbc9398f22c1c430974704bb07aab85810225e7425e32004" Mar 19 20:47:51 crc kubenswrapper[4799]: I0319 20:47:51.042630 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-j27sk" Mar 19 20:47:57 crc kubenswrapper[4799]: I0319 20:47:57.117946 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:47:57 crc kubenswrapper[4799]: E0319 20:47:57.119019 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.168967 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565888-dmgh6"] Mar 19 20:48:00 crc kubenswrapper[4799]: E0319 20:48:00.170445 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="extract-utilities" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 
20:48:00.170475 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="extract-utilities" Mar 19 20:48:00 crc kubenswrapper[4799]: E0319 20:48:00.170497 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="extract-content" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.170511 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="extract-content" Mar 19 20:48:00 crc kubenswrapper[4799]: E0319 20:48:00.170551 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77bbb41-e2be-4c81-b266-92ca0b6e8b44" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.170567 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77bbb41-e2be-4c81-b266-92ca0b6e8b44" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 20:48:00 crc kubenswrapper[4799]: E0319 20:48:00.170602 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="registry-server" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.170615 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="registry-server" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.170988 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18c1638-a9d7-44ac-9c69-eb7d9d092c36" containerName="registry-server" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.171021 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77bbb41-e2be-4c81-b266-92ca0b6e8b44" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.172154 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.174547 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.175173 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.175372 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.187866 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565888-dmgh6"] Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.274577 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtwd\" (UniqueName: \"kubernetes.io/projected/8711971f-7512-4f38-a0b8-c3a11e9c2245-kube-api-access-9qtwd\") pod \"auto-csr-approver-29565888-dmgh6\" (UID: \"8711971f-7512-4f38-a0b8-c3a11e9c2245\") " pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.376359 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtwd\" (UniqueName: \"kubernetes.io/projected/8711971f-7512-4f38-a0b8-c3a11e9c2245-kube-api-access-9qtwd\") pod \"auto-csr-approver-29565888-dmgh6\" (UID: \"8711971f-7512-4f38-a0b8-c3a11e9c2245\") " pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.423363 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtwd\" (UniqueName: \"kubernetes.io/projected/8711971f-7512-4f38-a0b8-c3a11e9c2245-kube-api-access-9qtwd\") pod \"auto-csr-approver-29565888-dmgh6\" (UID: \"8711971f-7512-4f38-a0b8-c3a11e9c2245\") " 
pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:00 crc kubenswrapper[4799]: I0319 20:48:00.512504 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:01 crc kubenswrapper[4799]: I0319 20:48:01.051752 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565888-dmgh6"] Mar 19 20:48:01 crc kubenswrapper[4799]: I0319 20:48:01.054499 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:48:01 crc kubenswrapper[4799]: I0319 20:48:01.171258 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" event={"ID":"8711971f-7512-4f38-a0b8-c3a11e9c2245","Type":"ContainerStarted","Data":"9af7da13fef15819f71433edfed11d9137f6e152667fa614d6a342f4c67b5726"} Mar 19 20:48:03 crc kubenswrapper[4799]: I0319 20:48:03.196296 4799 generic.go:334] "Generic (PLEG): container finished" podID="8711971f-7512-4f38-a0b8-c3a11e9c2245" containerID="81521428c9a43af2087f1d71fa542ecb1e21cba0f6a3e40525aa0a96648cb14f" exitCode=0 Mar 19 20:48:03 crc kubenswrapper[4799]: I0319 20:48:03.196459 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" event={"ID":"8711971f-7512-4f38-a0b8-c3a11e9c2245","Type":"ContainerDied","Data":"81521428c9a43af2087f1d71fa542ecb1e21cba0f6a3e40525aa0a96648cb14f"} Mar 19 20:48:04 crc kubenswrapper[4799]: I0319 20:48:04.581082 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:04 crc kubenswrapper[4799]: I0319 20:48:04.673901 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qtwd\" (UniqueName: \"kubernetes.io/projected/8711971f-7512-4f38-a0b8-c3a11e9c2245-kube-api-access-9qtwd\") pod \"8711971f-7512-4f38-a0b8-c3a11e9c2245\" (UID: \"8711971f-7512-4f38-a0b8-c3a11e9c2245\") " Mar 19 20:48:04 crc kubenswrapper[4799]: I0319 20:48:04.682600 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8711971f-7512-4f38-a0b8-c3a11e9c2245-kube-api-access-9qtwd" (OuterVolumeSpecName: "kube-api-access-9qtwd") pod "8711971f-7512-4f38-a0b8-c3a11e9c2245" (UID: "8711971f-7512-4f38-a0b8-c3a11e9c2245"). InnerVolumeSpecName "kube-api-access-9qtwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:48:04 crc kubenswrapper[4799]: I0319 20:48:04.778564 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qtwd\" (UniqueName: \"kubernetes.io/projected/8711971f-7512-4f38-a0b8-c3a11e9c2245-kube-api-access-9qtwd\") on node \"crc\" DevicePath \"\"" Mar 19 20:48:05 crc kubenswrapper[4799]: I0319 20:48:05.219268 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" event={"ID":"8711971f-7512-4f38-a0b8-c3a11e9c2245","Type":"ContainerDied","Data":"9af7da13fef15819f71433edfed11d9137f6e152667fa614d6a342f4c67b5726"} Mar 19 20:48:05 crc kubenswrapper[4799]: I0319 20:48:05.219615 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af7da13fef15819f71433edfed11d9137f6e152667fa614d6a342f4c67b5726" Mar 19 20:48:05 crc kubenswrapper[4799]: I0319 20:48:05.219336 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565888-dmgh6" Mar 19 20:48:05 crc kubenswrapper[4799]: I0319 20:48:05.680729 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565882-s8rnc"] Mar 19 20:48:05 crc kubenswrapper[4799]: I0319 20:48:05.692921 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565882-s8rnc"] Mar 19 20:48:07 crc kubenswrapper[4799]: I0319 20:48:07.136961 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68221cef-725e-4911-a651-39fc18fc5715" path="/var/lib/kubelet/pods/68221cef-725e-4911-a651-39fc18fc5715/volumes" Mar 19 20:48:08 crc kubenswrapper[4799]: I0319 20:48:08.116807 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:48:08 crc kubenswrapper[4799]: E0319 20:48:08.117198 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:48:19 crc kubenswrapper[4799]: I0319 20:48:19.116927 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:48:19 crc kubenswrapper[4799]: E0319 20:48:19.117822 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:48:32 crc kubenswrapper[4799]: I0319 20:48:32.116257 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:48:32 crc kubenswrapper[4799]: E0319 20:48:32.117484 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:48:43 crc kubenswrapper[4799]: I0319 20:48:43.124698 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:48:43 crc kubenswrapper[4799]: E0319 20:48:43.125832 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.783931 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:48:47 crc kubenswrapper[4799]: E0319 20:48:47.784848 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8711971f-7512-4f38-a0b8-c3a11e9c2245" containerName="oc" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.784870 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8711971f-7512-4f38-a0b8-c3a11e9c2245" containerName="oc" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.785198 4799 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8711971f-7512-4f38-a0b8-c3a11e9c2245" containerName="oc" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.786213 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.790001 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wxdjz" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.790361 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.790659 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.792026 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.800974 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933008 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-config-data\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933322 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc 
kubenswrapper[4799]: I0319 20:48:47.933345 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7pq4\" (UniqueName: \"kubernetes.io/projected/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-kube-api-access-j7pq4\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933482 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933639 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933756 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933923 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " 
pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.933963 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:47 crc kubenswrapper[4799]: I0319 20:48:47.934081 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035595 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035649 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035691 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 
19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035710 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035748 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035809 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-config-data\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035838 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035856 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7pq4\" (UniqueName: \"kubernetes.io/projected/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-kube-api-access-j7pq4\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.035890 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.036283 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.037331 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.037740 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.037893 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-config-data\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.037926 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.044194 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.048781 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.049771 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.056755 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7pq4\" (UniqueName: \"kubernetes.io/projected/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-kube-api-access-j7pq4\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.071944 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " 
pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.144879 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.656266 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:48:48 crc kubenswrapper[4799]: I0319 20:48:48.794845 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a","Type":"ContainerStarted","Data":"b748b240e734a874366a43fddb43f8c1fcfbcb51043d5e306f578a35f35668ab"} Mar 19 20:48:58 crc kubenswrapper[4799]: I0319 20:48:58.115949 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:48:58 crc kubenswrapper[4799]: E0319 20:48:58.118120 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:49:00 crc kubenswrapper[4799]: I0319 20:49:00.172509 4799 scope.go:117] "RemoveContainer" containerID="d56b1d204805ff7b97a5a58b2051903d9c796290f5371df56415d584fcff7094" Mar 19 20:49:11 crc kubenswrapper[4799]: I0319 20:49:11.116578 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:49:11 crc kubenswrapper[4799]: E0319 20:49:11.117650 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:49:20 crc kubenswrapper[4799]: E0319 20:49:20.406525 4799 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 19 20:49:20 crc kubenswrapper[4799]: E0319 20:49:20.407024 4799 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,Su
bPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j7pq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(66572ad6-a9d4-4dc7-ae3e-61a1d67d928a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:49:20 crc kubenswrapper[4799]: E0319 20:49:20.408186 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" Mar 19 20:49:21 crc kubenswrapper[4799]: E0319 20:49:21.118135 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" Mar 19 20:49:24 crc kubenswrapper[4799]: I0319 20:49:24.117767 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:49:24 crc kubenswrapper[4799]: E0319 20:49:24.119562 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:49:34 crc kubenswrapper[4799]: I0319 20:49:34.630684 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 20:49:36 crc kubenswrapper[4799]: I0319 20:49:36.267728 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a","Type":"ContainerStarted","Data":"8546b77afa6197cbd043fd216e8bccc4b5fca9b9c0891b4e0911c98d29c84dba"} Mar 19 20:49:36 crc kubenswrapper[4799]: I0319 20:49:36.297865 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.32796884 
podStartE2EDuration="50.297841467s" podCreationTimestamp="2026-03-19 20:48:46 +0000 UTC" firstStartedPulling="2026-03-19 20:48:48.657773936 +0000 UTC m=+2606.263727008" lastFinishedPulling="2026-03-19 20:49:34.627646523 +0000 UTC m=+2652.233599635" observedRunningTime="2026-03-19 20:49:36.293128685 +0000 UTC m=+2653.899081757" watchObservedRunningTime="2026-03-19 20:49:36.297841467 +0000 UTC m=+2653.903794579" Mar 19 20:49:38 crc kubenswrapper[4799]: I0319 20:49:38.115965 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:49:38 crc kubenswrapper[4799]: E0319 20:49:38.116531 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:49:50 crc kubenswrapper[4799]: I0319 20:49:50.116917 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:49:50 crc kubenswrapper[4799]: E0319 20:49:50.121650 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.141894 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565890-dtft2"] Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.143711 4799 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.145440 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.145559 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.146213 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.155016 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565890-dtft2"] Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.224089 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4d2\" (UniqueName: \"kubernetes.io/projected/943724bd-8de6-49cd-b029-ca39c51b4444-kube-api-access-ls4d2\") pod \"auto-csr-approver-29565890-dtft2\" (UID: \"943724bd-8de6-49cd-b029-ca39c51b4444\") " pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.326746 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls4d2\" (UniqueName: \"kubernetes.io/projected/943724bd-8de6-49cd-b029-ca39c51b4444-kube-api-access-ls4d2\") pod \"auto-csr-approver-29565890-dtft2\" (UID: \"943724bd-8de6-49cd-b029-ca39c51b4444\") " pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.350254 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4d2\" (UniqueName: \"kubernetes.io/projected/943724bd-8de6-49cd-b029-ca39c51b4444-kube-api-access-ls4d2\") pod \"auto-csr-approver-29565890-dtft2\" (UID: 
\"943724bd-8de6-49cd-b029-ca39c51b4444\") " pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.495729 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:00 crc kubenswrapper[4799]: I0319 20:50:00.965605 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565890-dtft2"] Mar 19 20:50:00 crc kubenswrapper[4799]: W0319 20:50:00.976654 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943724bd_8de6_49cd_b029_ca39c51b4444.slice/crio-de74d8c8f0844acfc3ccb61bd29edef071726297f1c93d1a2fdf50e1fb0caf46 WatchSource:0}: Error finding container de74d8c8f0844acfc3ccb61bd29edef071726297f1c93d1a2fdf50e1fb0caf46: Status 404 returned error can't find the container with id de74d8c8f0844acfc3ccb61bd29edef071726297f1c93d1a2fdf50e1fb0caf46 Mar 19 20:50:01 crc kubenswrapper[4799]: I0319 20:50:01.116756 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:50:01 crc kubenswrapper[4799]: E0319 20:50:01.117687 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:50:01 crc kubenswrapper[4799]: I0319 20:50:01.536720 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565890-dtft2" 
event={"ID":"943724bd-8de6-49cd-b029-ca39c51b4444","Type":"ContainerStarted","Data":"de74d8c8f0844acfc3ccb61bd29edef071726297f1c93d1a2fdf50e1fb0caf46"} Mar 19 20:50:04 crc kubenswrapper[4799]: I0319 20:50:04.582259 4799 generic.go:334] "Generic (PLEG): container finished" podID="943724bd-8de6-49cd-b029-ca39c51b4444" containerID="cb678004ef9e2ecb1e344d84b09bc75eb5c686425163799338f912070ed43ee1" exitCode=0 Mar 19 20:50:04 crc kubenswrapper[4799]: I0319 20:50:04.583050 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565890-dtft2" event={"ID":"943724bd-8de6-49cd-b029-ca39c51b4444","Type":"ContainerDied","Data":"cb678004ef9e2ecb1e344d84b09bc75eb5c686425163799338f912070ed43ee1"} Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.009013 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.141164 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls4d2\" (UniqueName: \"kubernetes.io/projected/943724bd-8de6-49cd-b029-ca39c51b4444-kube-api-access-ls4d2\") pod \"943724bd-8de6-49cd-b029-ca39c51b4444\" (UID: \"943724bd-8de6-49cd-b029-ca39c51b4444\") " Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.148640 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943724bd-8de6-49cd-b029-ca39c51b4444-kube-api-access-ls4d2" (OuterVolumeSpecName: "kube-api-access-ls4d2") pod "943724bd-8de6-49cd-b029-ca39c51b4444" (UID: "943724bd-8de6-49cd-b029-ca39c51b4444"). InnerVolumeSpecName "kube-api-access-ls4d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.244042 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls4d2\" (UniqueName: \"kubernetes.io/projected/943724bd-8de6-49cd-b029-ca39c51b4444-kube-api-access-ls4d2\") on node \"crc\" DevicePath \"\"" Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.602059 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565890-dtft2" event={"ID":"943724bd-8de6-49cd-b029-ca39c51b4444","Type":"ContainerDied","Data":"de74d8c8f0844acfc3ccb61bd29edef071726297f1c93d1a2fdf50e1fb0caf46"} Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.602272 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de74d8c8f0844acfc3ccb61bd29edef071726297f1c93d1a2fdf50e1fb0caf46" Mar 19 20:50:06 crc kubenswrapper[4799]: I0319 20:50:06.602321 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565890-dtft2" Mar 19 20:50:07 crc kubenswrapper[4799]: I0319 20:50:07.094563 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565884-s2x8j"] Mar 19 20:50:07 crc kubenswrapper[4799]: I0319 20:50:07.104566 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565884-s2x8j"] Mar 19 20:50:07 crc kubenswrapper[4799]: I0319 20:50:07.128180 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd4b2ca-7025-43b4-82b0-26fd43c60018" path="/var/lib/kubelet/pods/dcd4b2ca-7025-43b4-82b0-26fd43c60018/volumes" Mar 19 20:50:16 crc kubenswrapper[4799]: I0319 20:50:16.117944 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:50:16 crc kubenswrapper[4799]: E0319 20:50:16.118645 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:50:20 crc kubenswrapper[4799]: I0319 20:50:20.361651 4799 scope.go:117] "RemoveContainer" containerID="81c7afe2a81edf7da56e6c21fd337a44316e802c9342ad22ba8a6a7f8d273ab2" Mar 19 20:50:28 crc kubenswrapper[4799]: I0319 20:50:28.116373 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:50:28 crc kubenswrapper[4799]: E0319 20:50:28.117775 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:50:41 crc kubenswrapper[4799]: I0319 20:50:41.116696 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:50:41 crc kubenswrapper[4799]: E0319 20:50:41.117734 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:50:53 crc kubenswrapper[4799]: I0319 20:50:53.123851 4799 scope.go:117] "RemoveContainer" 
containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:50:53 crc kubenswrapper[4799]: E0319 20:50:53.126116 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:51:06 crc kubenswrapper[4799]: I0319 20:51:06.116701 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:51:06 crc kubenswrapper[4799]: E0319 20:51:06.118518 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:51:17 crc kubenswrapper[4799]: I0319 20:51:17.116432 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:51:17 crc kubenswrapper[4799]: E0319 20:51:17.117336 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:51:28 crc kubenswrapper[4799]: I0319 20:51:28.116267 4799 scope.go:117] 
"RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:51:28 crc kubenswrapper[4799]: E0319 20:51:28.117246 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.223645 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gcvl8"] Mar 19 20:51:35 crc kubenswrapper[4799]: E0319 20:51:35.225229 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943724bd-8de6-49cd-b029-ca39c51b4444" containerName="oc" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.225261 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="943724bd-8de6-49cd-b029-ca39c51b4444" containerName="oc" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.225809 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="943724bd-8de6-49cd-b029-ca39c51b4444" containerName="oc" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.228944 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.238558 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gcvl8"] Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.327274 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-utilities\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.327587 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-catalog-content\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.327623 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7c8n\" (UniqueName: \"kubernetes.io/projected/a0b7549c-f1c7-4098-a109-5586d0b86e20-kube-api-access-f7c8n\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.429505 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-catalog-content\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.429570 4799 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-f7c8n\" (UniqueName: \"kubernetes.io/projected/a0b7549c-f1c7-4098-a109-5586d0b86e20-kube-api-access-f7c8n\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.429615 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-utilities\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.430239 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-utilities\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.430544 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-catalog-content\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.452924 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7c8n\" (UniqueName: \"kubernetes.io/projected/a0b7549c-f1c7-4098-a109-5586d0b86e20-kube-api-access-f7c8n\") pod \"redhat-operators-gcvl8\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:35 crc kubenswrapper[4799]: I0319 20:51:35.567865 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:36 crc kubenswrapper[4799]: I0319 20:51:36.543057 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gcvl8"] Mar 19 20:51:36 crc kubenswrapper[4799]: W0319 20:51:36.549682 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0b7549c_f1c7_4098_a109_5586d0b86e20.slice/crio-a47ba411091f7d014c5c411520f3c4fd14c57e77b2da554168ac2f3fb0c8f476 WatchSource:0}: Error finding container a47ba411091f7d014c5c411520f3c4fd14c57e77b2da554168ac2f3fb0c8f476: Status 404 returned error can't find the container with id a47ba411091f7d014c5c411520f3c4fd14c57e77b2da554168ac2f3fb0c8f476 Mar 19 20:51:36 crc kubenswrapper[4799]: I0319 20:51:36.948249 4799 generic.go:334] "Generic (PLEG): container finished" podID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerID="577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3" exitCode=0 Mar 19 20:51:36 crc kubenswrapper[4799]: I0319 20:51:36.948340 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerDied","Data":"577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3"} Mar 19 20:51:36 crc kubenswrapper[4799]: I0319 20:51:36.948548 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerStarted","Data":"a47ba411091f7d014c5c411520f3c4fd14c57e77b2da554168ac2f3fb0c8f476"} Mar 19 20:51:40 crc kubenswrapper[4799]: I0319 20:51:40.989768 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" 
event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerStarted","Data":"85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621"} Mar 19 20:51:43 crc kubenswrapper[4799]: I0319 20:51:43.015831 4799 generic.go:334] "Generic (PLEG): container finished" podID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerID="85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621" exitCode=0 Mar 19 20:51:43 crc kubenswrapper[4799]: I0319 20:51:43.016455 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerDied","Data":"85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621"} Mar 19 20:51:43 crc kubenswrapper[4799]: I0319 20:51:43.129009 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:51:43 crc kubenswrapper[4799]: E0319 20:51:43.129797 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:51:44 crc kubenswrapper[4799]: I0319 20:51:44.027412 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerStarted","Data":"2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972"} Mar 19 20:51:44 crc kubenswrapper[4799]: I0319 20:51:44.050817 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gcvl8" podStartSLOduration=2.531371425 podStartE2EDuration="9.050794762s" 
podCreationTimestamp="2026-03-19 20:51:35 +0000 UTC" firstStartedPulling="2026-03-19 20:51:36.95070571 +0000 UTC m=+2774.556658782" lastFinishedPulling="2026-03-19 20:51:43.470129037 +0000 UTC m=+2781.076082119" observedRunningTime="2026-03-19 20:51:44.046255205 +0000 UTC m=+2781.652208287" watchObservedRunningTime="2026-03-19 20:51:44.050794762 +0000 UTC m=+2781.656747834" Mar 19 20:51:45 crc kubenswrapper[4799]: I0319 20:51:45.568272 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:45 crc kubenswrapper[4799]: I0319 20:51:45.568761 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:51:46 crc kubenswrapper[4799]: I0319 20:51:46.643086 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gcvl8" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="registry-server" probeResult="failure" output=< Mar 19 20:51:46 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:51:46 crc kubenswrapper[4799]: > Mar 19 20:51:55 crc kubenswrapper[4799]: I0319 20:51:55.116559 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:51:55 crc kubenswrapper[4799]: E0319 20:51:55.117708 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:51:56 crc kubenswrapper[4799]: I0319 20:51:56.639688 4799 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-gcvl8" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="registry-server" probeResult="failure" output=< Mar 19 20:51:56 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 20:51:56 crc kubenswrapper[4799]: > Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.148408 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565892-fgq5j"] Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.150090 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.153152 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.153695 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.156338 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.175736 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565892-fgq5j"] Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.225616 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49s2\" (UniqueName: \"kubernetes.io/projected/d8f8c752-4955-4d9e-afc3-0163858f464b-kube-api-access-b49s2\") pod \"auto-csr-approver-29565892-fgq5j\" (UID: \"d8f8c752-4955-4d9e-afc3-0163858f464b\") " pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.327605 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49s2\" (UniqueName: 
\"kubernetes.io/projected/d8f8c752-4955-4d9e-afc3-0163858f464b-kube-api-access-b49s2\") pod \"auto-csr-approver-29565892-fgq5j\" (UID: \"d8f8c752-4955-4d9e-afc3-0163858f464b\") " pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.357799 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49s2\" (UniqueName: \"kubernetes.io/projected/d8f8c752-4955-4d9e-afc3-0163858f464b-kube-api-access-b49s2\") pod \"auto-csr-approver-29565892-fgq5j\" (UID: \"d8f8c752-4955-4d9e-afc3-0163858f464b\") " pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.472141 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:00 crc kubenswrapper[4799]: I0319 20:52:00.953412 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565892-fgq5j"] Mar 19 20:52:01 crc kubenswrapper[4799]: I0319 20:52:01.221250 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" event={"ID":"d8f8c752-4955-4d9e-afc3-0163858f464b","Type":"ContainerStarted","Data":"81816b1d46f90973c8431bb2576f37fe6147660450305c584ab6c915b1e0e14c"} Mar 19 20:52:02 crc kubenswrapper[4799]: I0319 20:52:02.231031 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" event={"ID":"d8f8c752-4955-4d9e-afc3-0163858f464b","Type":"ContainerStarted","Data":"f01a47bd98e0bf0ae6328cb420f1700edfca58c1557266df7c91957ae9690a2b"} Mar 19 20:52:02 crc kubenswrapper[4799]: I0319 20:52:02.256218 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" podStartSLOduration=1.304474019 podStartE2EDuration="2.256198769s" podCreationTimestamp="2026-03-19 20:52:00 +0000 UTC" 
firstStartedPulling="2026-03-19 20:52:00.954518132 +0000 UTC m=+2798.560471214" lastFinishedPulling="2026-03-19 20:52:01.906242892 +0000 UTC m=+2799.512195964" observedRunningTime="2026-03-19 20:52:02.246139137 +0000 UTC m=+2799.852092209" watchObservedRunningTime="2026-03-19 20:52:02.256198769 +0000 UTC m=+2799.862151861" Mar 19 20:52:03 crc kubenswrapper[4799]: I0319 20:52:03.245577 4799 generic.go:334] "Generic (PLEG): container finished" podID="d8f8c752-4955-4d9e-afc3-0163858f464b" containerID="f01a47bd98e0bf0ae6328cb420f1700edfca58c1557266df7c91957ae9690a2b" exitCode=0 Mar 19 20:52:03 crc kubenswrapper[4799]: I0319 20:52:03.245693 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" event={"ID":"d8f8c752-4955-4d9e-afc3-0163858f464b","Type":"ContainerDied","Data":"f01a47bd98e0bf0ae6328cb420f1700edfca58c1557266df7c91957ae9690a2b"} Mar 19 20:52:04 crc kubenswrapper[4799]: I0319 20:52:04.698976 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:04 crc kubenswrapper[4799]: I0319 20:52:04.718210 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49s2\" (UniqueName: \"kubernetes.io/projected/d8f8c752-4955-4d9e-afc3-0163858f464b-kube-api-access-b49s2\") pod \"d8f8c752-4955-4d9e-afc3-0163858f464b\" (UID: \"d8f8c752-4955-4d9e-afc3-0163858f464b\") " Mar 19 20:52:04 crc kubenswrapper[4799]: I0319 20:52:04.727306 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f8c752-4955-4d9e-afc3-0163858f464b-kube-api-access-b49s2" (OuterVolumeSpecName: "kube-api-access-b49s2") pod "d8f8c752-4955-4d9e-afc3-0163858f464b" (UID: "d8f8c752-4955-4d9e-afc3-0163858f464b"). InnerVolumeSpecName "kube-api-access-b49s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:52:04 crc kubenswrapper[4799]: I0319 20:52:04.823146 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49s2\" (UniqueName: \"kubernetes.io/projected/d8f8c752-4955-4d9e-afc3-0163858f464b-kube-api-access-b49s2\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.264710 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" event={"ID":"d8f8c752-4955-4d9e-afc3-0163858f464b","Type":"ContainerDied","Data":"81816b1d46f90973c8431bb2576f37fe6147660450305c584ab6c915b1e0e14c"} Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.265190 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81816b1d46f90973c8431bb2576f37fe6147660450305c584ab6c915b1e0e14c" Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.264915 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565892-fgq5j" Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.319794 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565886-hlhm2"] Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.327504 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565886-hlhm2"] Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.664739 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:52:05 crc kubenswrapper[4799]: I0319 20:52:05.735962 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:52:06 crc kubenswrapper[4799]: I0319 20:52:06.416689 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gcvl8"] Mar 19 20:52:07 crc 
kubenswrapper[4799]: I0319 20:52:07.138348 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7" path="/var/lib/kubelet/pods/5ed2bde4-e5fc-4d6b-a1b5-f362c36497a7/volumes" Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.286104 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gcvl8" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="registry-server" containerID="cri-o://2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972" gracePeriod=2 Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.868818 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.988429 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-catalog-content\") pod \"a0b7549c-f1c7-4098-a109-5586d0b86e20\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.988527 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-utilities\") pod \"a0b7549c-f1c7-4098-a109-5586d0b86e20\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.988594 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7c8n\" (UniqueName: \"kubernetes.io/projected/a0b7549c-f1c7-4098-a109-5586d0b86e20-kube-api-access-f7c8n\") pod \"a0b7549c-f1c7-4098-a109-5586d0b86e20\" (UID: \"a0b7549c-f1c7-4098-a109-5586d0b86e20\") " Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.989360 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-utilities" (OuterVolumeSpecName: "utilities") pod "a0b7549c-f1c7-4098-a109-5586d0b86e20" (UID: "a0b7549c-f1c7-4098-a109-5586d0b86e20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:52:07 crc kubenswrapper[4799]: I0319 20:52:07.998547 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0b7549c-f1c7-4098-a109-5586d0b86e20-kube-api-access-f7c8n" (OuterVolumeSpecName: "kube-api-access-f7c8n") pod "a0b7549c-f1c7-4098-a109-5586d0b86e20" (UID: "a0b7549c-f1c7-4098-a109-5586d0b86e20"). InnerVolumeSpecName "kube-api-access-f7c8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.091589 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7c8n\" (UniqueName: \"kubernetes.io/projected/a0b7549c-f1c7-4098-a109-5586d0b86e20-kube-api-access-f7c8n\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.091635 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.117604 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.215100 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0b7549c-f1c7-4098-a109-5586d0b86e20" (UID: "a0b7549c-f1c7-4098-a109-5586d0b86e20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.298033 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0b7549c-f1c7-4098-a109-5586d0b86e20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.298980 4799 generic.go:334] "Generic (PLEG): container finished" podID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerID="2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972" exitCode=0 Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.299026 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerDied","Data":"2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972"} Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.299090 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gcvl8" event={"ID":"a0b7549c-f1c7-4098-a109-5586d0b86e20","Type":"ContainerDied","Data":"a47ba411091f7d014c5c411520f3c4fd14c57e77b2da554168ac2f3fb0c8f476"} Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.299117 4799 scope.go:117] "RemoveContainer" containerID="2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.299275 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gcvl8" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.333513 4799 scope.go:117] "RemoveContainer" containerID="85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.354717 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gcvl8"] Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.367781 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gcvl8"] Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.378590 4799 scope.go:117] "RemoveContainer" containerID="577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.407857 4799 scope.go:117] "RemoveContainer" containerID="2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972" Mar 19 20:52:08 crc kubenswrapper[4799]: E0319 20:52:08.408401 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972\": container with ID starting with 2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972 not found: ID does not exist" containerID="2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.408452 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972"} err="failed to get container status \"2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972\": rpc error: code = NotFound desc = could not find container \"2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972\": container with ID starting with 2d5acb06e5613b567a9be9f2fd6d98a220bac8cce79ec71f97f1e020398ac972 not found: ID does 
not exist" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.408483 4799 scope.go:117] "RemoveContainer" containerID="85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621" Mar 19 20:52:08 crc kubenswrapper[4799]: E0319 20:52:08.409228 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621\": container with ID starting with 85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621 not found: ID does not exist" containerID="85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.409268 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621"} err="failed to get container status \"85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621\": rpc error: code = NotFound desc = could not find container \"85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621\": container with ID starting with 85de125332c7ef960e6225d9d588c875a5651d93da0ccb42f6723fdf5785b621 not found: ID does not exist" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.409297 4799 scope.go:117] "RemoveContainer" containerID="577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3" Mar 19 20:52:08 crc kubenswrapper[4799]: E0319 20:52:08.409713 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3\": container with ID starting with 577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3 not found: ID does not exist" containerID="577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3" Mar 19 20:52:08 crc kubenswrapper[4799]: I0319 20:52:08.409758 4799 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3"} err="failed to get container status \"577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3\": rpc error: code = NotFound desc = could not find container \"577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3\": container with ID starting with 577618253ead2bf9a2af22c0e5d7ef3159e39715380ff7dfe783ddaab19600d3 not found: ID does not exist" Mar 19 20:52:09 crc kubenswrapper[4799]: I0319 20:52:09.129537 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" path="/var/lib/kubelet/pods/a0b7549c-f1c7-4098-a109-5586d0b86e20/volumes" Mar 19 20:52:09 crc kubenswrapper[4799]: I0319 20:52:09.318292 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"babe0dde2e8ae616818bc758cddcd4e3f5c5c20e0c622347f9b7f7191cc4b383"} Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.000295 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vggct"] Mar 19 20:52:12 crc kubenswrapper[4799]: E0319 20:52:12.001455 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="extract-utilities" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.001477 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="extract-utilities" Mar 19 20:52:12 crc kubenswrapper[4799]: E0319 20:52:12.001497 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="extract-content" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.001509 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="extract-content" Mar 19 20:52:12 crc kubenswrapper[4799]: E0319 20:52:12.001532 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="registry-server" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.001546 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="registry-server" Mar 19 20:52:12 crc kubenswrapper[4799]: E0319 20:52:12.001592 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f8c752-4955-4d9e-afc3-0163858f464b" containerName="oc" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.001607 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f8c752-4955-4d9e-afc3-0163858f464b" containerName="oc" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.001934 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f8c752-4955-4d9e-afc3-0163858f464b" containerName="oc" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.001962 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0b7549c-f1c7-4098-a109-5586d0b86e20" containerName="registry-server" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.004266 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.023243 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vggct"] Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.128057 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-catalog-content\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.128452 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-utilities\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.128489 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgm9d\" (UniqueName: \"kubernetes.io/projected/98154566-e24f-4db6-8c88-5865be0b6155-kube-api-access-rgm9d\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.230485 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-catalog-content\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.230550 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-utilities\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.230586 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgm9d\" (UniqueName: \"kubernetes.io/projected/98154566-e24f-4db6-8c88-5865be0b6155-kube-api-access-rgm9d\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.230943 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-catalog-content\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.231298 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-utilities\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.250631 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgm9d\" (UniqueName: \"kubernetes.io/projected/98154566-e24f-4db6-8c88-5865be0b6155-kube-api-access-rgm9d\") pod \"certified-operators-vggct\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.328660 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:12 crc kubenswrapper[4799]: I0319 20:52:12.804054 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vggct"] Mar 19 20:52:12 crc kubenswrapper[4799]: W0319 20:52:12.822551 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98154566_e24f_4db6_8c88_5865be0b6155.slice/crio-5d9b1a426d2b46fd612e20bbbc8117dfebcd86beae225fcf81f03527bab75c39 WatchSource:0}: Error finding container 5d9b1a426d2b46fd612e20bbbc8117dfebcd86beae225fcf81f03527bab75c39: Status 404 returned error can't find the container with id 5d9b1a426d2b46fd612e20bbbc8117dfebcd86beae225fcf81f03527bab75c39 Mar 19 20:52:13 crc kubenswrapper[4799]: I0319 20:52:13.380913 4799 generic.go:334] "Generic (PLEG): container finished" podID="98154566-e24f-4db6-8c88-5865be0b6155" containerID="fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2" exitCode=0 Mar 19 20:52:13 crc kubenswrapper[4799]: I0319 20:52:13.380984 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vggct" event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerDied","Data":"fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2"} Mar 19 20:52:13 crc kubenswrapper[4799]: I0319 20:52:13.381434 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vggct" event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerStarted","Data":"5d9b1a426d2b46fd612e20bbbc8117dfebcd86beae225fcf81f03527bab75c39"} Mar 19 20:52:14 crc kubenswrapper[4799]: I0319 20:52:14.397510 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vggct" 
event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerStarted","Data":"93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7"} Mar 19 20:52:16 crc kubenswrapper[4799]: I0319 20:52:16.423091 4799 generic.go:334] "Generic (PLEG): container finished" podID="98154566-e24f-4db6-8c88-5865be0b6155" containerID="93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7" exitCode=0 Mar 19 20:52:16 crc kubenswrapper[4799]: I0319 20:52:16.423201 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vggct" event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerDied","Data":"93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7"} Mar 19 20:52:17 crc kubenswrapper[4799]: I0319 20:52:17.433967 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vggct" event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerStarted","Data":"eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7"} Mar 19 20:52:17 crc kubenswrapper[4799]: I0319 20:52:17.474264 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vggct" podStartSLOduration=3.032913437 podStartE2EDuration="6.474237776s" podCreationTimestamp="2026-03-19 20:52:11 +0000 UTC" firstStartedPulling="2026-03-19 20:52:13.383537669 +0000 UTC m=+2810.989490781" lastFinishedPulling="2026-03-19 20:52:16.824862058 +0000 UTC m=+2814.430815120" observedRunningTime="2026-03-19 20:52:17.459772021 +0000 UTC m=+2815.065725123" watchObservedRunningTime="2026-03-19 20:52:17.474237776 +0000 UTC m=+2815.080190888" Mar 19 20:52:20 crc kubenswrapper[4799]: I0319 20:52:20.833765 4799 scope.go:117] "RemoveContainer" containerID="8ec7d3e47f37efbe7a74979f4d5a811ccdb5df4462ecfe7f8fb3d637bc923926" Mar 19 20:52:22 crc kubenswrapper[4799]: I0319 20:52:22.329429 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:22 crc kubenswrapper[4799]: I0319 20:52:22.329732 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:22 crc kubenswrapper[4799]: I0319 20:52:22.398684 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:22 crc kubenswrapper[4799]: I0319 20:52:22.539919 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:22 crc kubenswrapper[4799]: I0319 20:52:22.643407 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vggct"] Mar 19 20:52:24 crc kubenswrapper[4799]: I0319 20:52:24.506945 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vggct" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="registry-server" containerID="cri-o://eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7" gracePeriod=2 Mar 19 20:52:24 crc kubenswrapper[4799]: I0319 20:52:24.977546 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.121636 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-catalog-content\") pod \"98154566-e24f-4db6-8c88-5865be0b6155\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.122081 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgm9d\" (UniqueName: \"kubernetes.io/projected/98154566-e24f-4db6-8c88-5865be0b6155-kube-api-access-rgm9d\") pod \"98154566-e24f-4db6-8c88-5865be0b6155\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.122179 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-utilities\") pod \"98154566-e24f-4db6-8c88-5865be0b6155\" (UID: \"98154566-e24f-4db6-8c88-5865be0b6155\") " Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.123271 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-utilities" (OuterVolumeSpecName: "utilities") pod "98154566-e24f-4db6-8c88-5865be0b6155" (UID: "98154566-e24f-4db6-8c88-5865be0b6155"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.131957 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98154566-e24f-4db6-8c88-5865be0b6155-kube-api-access-rgm9d" (OuterVolumeSpecName: "kube-api-access-rgm9d") pod "98154566-e24f-4db6-8c88-5865be0b6155" (UID: "98154566-e24f-4db6-8c88-5865be0b6155"). InnerVolumeSpecName "kube-api-access-rgm9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.224887 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgm9d\" (UniqueName: \"kubernetes.io/projected/98154566-e24f-4db6-8c88-5865be0b6155-kube-api-access-rgm9d\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.225063 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.336258 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98154566-e24f-4db6-8c88-5865be0b6155" (UID: "98154566-e24f-4db6-8c88-5865be0b6155"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.428778 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98154566-e24f-4db6-8c88-5865be0b6155-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.519656 4799 generic.go:334] "Generic (PLEG): container finished" podID="98154566-e24f-4db6-8c88-5865be0b6155" containerID="eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7" exitCode=0 Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.519700 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vggct" event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerDied","Data":"eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7"} Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.519726 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vggct" event={"ID":"98154566-e24f-4db6-8c88-5865be0b6155","Type":"ContainerDied","Data":"5d9b1a426d2b46fd612e20bbbc8117dfebcd86beae225fcf81f03527bab75c39"} Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.519742 4799 scope.go:117] "RemoveContainer" containerID="eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.519805 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vggct" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.559231 4799 scope.go:117] "RemoveContainer" containerID="93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.586979 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vggct"] Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.599047 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vggct"] Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.616031 4799 scope.go:117] "RemoveContainer" containerID="fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.667121 4799 scope.go:117] "RemoveContainer" containerID="eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7" Mar 19 20:52:25 crc kubenswrapper[4799]: E0319 20:52:25.667600 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7\": container with ID starting with eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7 not found: ID does not exist" containerID="eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 
20:52:25.667642 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7"} err="failed to get container status \"eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7\": rpc error: code = NotFound desc = could not find container \"eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7\": container with ID starting with eb685b087c8b6f436ada0b7ac14c5a704813d13d454f697690864e022bcf81a7 not found: ID does not exist" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.667670 4799 scope.go:117] "RemoveContainer" containerID="93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7" Mar 19 20:52:25 crc kubenswrapper[4799]: E0319 20:52:25.668123 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7\": container with ID starting with 93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7 not found: ID does not exist" containerID="93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.668162 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7"} err="failed to get container status \"93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7\": rpc error: code = NotFound desc = could not find container \"93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7\": container with ID starting with 93f7c3608e8b4e70c879e2a13c74f403995e6e42264591d22c7b936ae4677db7 not found: ID does not exist" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.668190 4799 scope.go:117] "RemoveContainer" containerID="fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2" Mar 19 20:52:25 crc 
kubenswrapper[4799]: E0319 20:52:25.668489 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2\": container with ID starting with fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2 not found: ID does not exist" containerID="fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2" Mar 19 20:52:25 crc kubenswrapper[4799]: I0319 20:52:25.668651 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2"} err="failed to get container status \"fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2\": rpc error: code = NotFound desc = could not find container \"fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2\": container with ID starting with fbeb15accf45d54b219a6a880c29ef964823cc387f621da918d2ae683fd387d2 not found: ID does not exist" Mar 19 20:52:27 crc kubenswrapper[4799]: I0319 20:52:27.136242 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98154566-e24f-4db6-8c88-5865be0b6155" path="/var/lib/kubelet/pods/98154566-e24f-4db6-8c88-5865be0b6155/volumes" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.170106 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565894-6nf54"] Mar 19 20:54:00 crc kubenswrapper[4799]: E0319 20:54:00.171532 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="extract-content" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.171554 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="extract-content" Mar 19 20:54:00 crc kubenswrapper[4799]: E0319 20:54:00.171582 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="extract-utilities" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.171618 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="extract-utilities" Mar 19 20:54:00 crc kubenswrapper[4799]: E0319 20:54:00.171676 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="registry-server" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.171689 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="registry-server" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.172017 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="98154566-e24f-4db6-8c88-5865be0b6155" containerName="registry-server" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.173117 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.178697 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.179186 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.179539 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.196084 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565894-6nf54"] Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.228900 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qhc\" (UniqueName: 
\"kubernetes.io/projected/4be40fe5-fb7f-45c0-b312-30abdc732828-kube-api-access-n5qhc\") pod \"auto-csr-approver-29565894-6nf54\" (UID: \"4be40fe5-fb7f-45c0-b312-30abdc732828\") " pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.330335 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qhc\" (UniqueName: \"kubernetes.io/projected/4be40fe5-fb7f-45c0-b312-30abdc732828-kube-api-access-n5qhc\") pod \"auto-csr-approver-29565894-6nf54\" (UID: \"4be40fe5-fb7f-45c0-b312-30abdc732828\") " pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.351673 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qhc\" (UniqueName: \"kubernetes.io/projected/4be40fe5-fb7f-45c0-b312-30abdc732828-kube-api-access-n5qhc\") pod \"auto-csr-approver-29565894-6nf54\" (UID: \"4be40fe5-fb7f-45c0-b312-30abdc732828\") " pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:00 crc kubenswrapper[4799]: I0319 20:54:00.547545 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:01 crc kubenswrapper[4799]: I0319 20:54:01.093434 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565894-6nf54"] Mar 19 20:54:01 crc kubenswrapper[4799]: I0319 20:54:01.109783 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:54:01 crc kubenswrapper[4799]: I0319 20:54:01.590927 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565894-6nf54" event={"ID":"4be40fe5-fb7f-45c0-b312-30abdc732828","Type":"ContainerStarted","Data":"6321cc8d341d2e10800e37b11a42adfb9531e688a5cb720ebd6489267987c4c0"} Mar 19 20:54:02 crc kubenswrapper[4799]: I0319 20:54:02.602227 4799 generic.go:334] "Generic (PLEG): container finished" podID="4be40fe5-fb7f-45c0-b312-30abdc732828" containerID="80f2ed4df3f02d5343b74092621b46a2cadb33f009a1b3f5079c774694ed6edd" exitCode=0 Mar 19 20:54:02 crc kubenswrapper[4799]: I0319 20:54:02.602450 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565894-6nf54" event={"ID":"4be40fe5-fb7f-45c0-b312-30abdc732828","Type":"ContainerDied","Data":"80f2ed4df3f02d5343b74092621b46a2cadb33f009a1b3f5079c774694ed6edd"} Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.098897 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.209400 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5qhc\" (UniqueName: \"kubernetes.io/projected/4be40fe5-fb7f-45c0-b312-30abdc732828-kube-api-access-n5qhc\") pod \"4be40fe5-fb7f-45c0-b312-30abdc732828\" (UID: \"4be40fe5-fb7f-45c0-b312-30abdc732828\") " Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.216689 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be40fe5-fb7f-45c0-b312-30abdc732828-kube-api-access-n5qhc" (OuterVolumeSpecName: "kube-api-access-n5qhc") pod "4be40fe5-fb7f-45c0-b312-30abdc732828" (UID: "4be40fe5-fb7f-45c0-b312-30abdc732828"). InnerVolumeSpecName "kube-api-access-n5qhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.311497 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5qhc\" (UniqueName: \"kubernetes.io/projected/4be40fe5-fb7f-45c0-b312-30abdc732828-kube-api-access-n5qhc\") on node \"crc\" DevicePath \"\"" Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.629951 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565894-6nf54" event={"ID":"4be40fe5-fb7f-45c0-b312-30abdc732828","Type":"ContainerDied","Data":"6321cc8d341d2e10800e37b11a42adfb9531e688a5cb720ebd6489267987c4c0"} Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.630338 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6321cc8d341d2e10800e37b11a42adfb9531e688a5cb720ebd6489267987c4c0" Mar 19 20:54:04 crc kubenswrapper[4799]: I0319 20:54:04.630050 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565894-6nf54" Mar 19 20:54:05 crc kubenswrapper[4799]: I0319 20:54:05.213752 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565888-dmgh6"] Mar 19 20:54:05 crc kubenswrapper[4799]: I0319 20:54:05.225070 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565888-dmgh6"] Mar 19 20:54:07 crc kubenswrapper[4799]: I0319 20:54:07.136328 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8711971f-7512-4f38-a0b8-c3a11e9c2245" path="/var/lib/kubelet/pods/8711971f-7512-4f38-a0b8-c3a11e9c2245/volumes" Mar 19 20:54:21 crc kubenswrapper[4799]: I0319 20:54:21.004529 4799 scope.go:117] "RemoveContainer" containerID="81521428c9a43af2087f1d71fa542ecb1e21cba0f6a3e40525aa0a96648cb14f" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.006466 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w6jjc"] Mar 19 20:54:22 crc kubenswrapper[4799]: E0319 20:54:22.007260 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be40fe5-fb7f-45c0-b312-30abdc732828" containerName="oc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.007276 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be40fe5-fb7f-45c0-b312-30abdc732828" containerName="oc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.007537 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be40fe5-fb7f-45c0-b312-30abdc732828" containerName="oc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.009191 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.023335 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6jjc"] Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.199126 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-utilities\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.199188 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5tfz\" (UniqueName: \"kubernetes.io/projected/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-kube-api-access-x5tfz\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.199229 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-catalog-content\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.302969 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-utilities\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.303048 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-x5tfz\" (UniqueName: \"kubernetes.io/projected/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-kube-api-access-x5tfz\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.303156 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-catalog-content\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.304283 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-catalog-content\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.304805 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-utilities\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.344097 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5tfz\" (UniqueName: \"kubernetes.io/projected/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-kube-api-access-x5tfz\") pod \"redhat-marketplace-w6jjc\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:22 crc kubenswrapper[4799]: I0319 20:54:22.628464 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:23 crc kubenswrapper[4799]: I0319 20:54:23.112607 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6jjc"] Mar 19 20:54:23 crc kubenswrapper[4799]: I0319 20:54:23.831202 4799 generic.go:334] "Generic (PLEG): container finished" podID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerID="93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9" exitCode=0 Mar 19 20:54:23 crc kubenswrapper[4799]: I0319 20:54:23.831254 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerDied","Data":"93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9"} Mar 19 20:54:23 crc kubenswrapper[4799]: I0319 20:54:23.831285 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerStarted","Data":"ae74c2a9516961a9d49e554f2c2a9ca3fd1dfcb65b10c6eaad51ea5729f28754"} Mar 19 20:54:24 crc kubenswrapper[4799]: I0319 20:54:24.842278 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerStarted","Data":"2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b"} Mar 19 20:54:25 crc kubenswrapper[4799]: I0319 20:54:25.854190 4799 generic.go:334] "Generic (PLEG): container finished" podID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerID="2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b" exitCode=0 Mar 19 20:54:25 crc kubenswrapper[4799]: I0319 20:54:25.854285 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" 
event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerDied","Data":"2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b"} Mar 19 20:54:26 crc kubenswrapper[4799]: I0319 20:54:26.864262 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerStarted","Data":"1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac"} Mar 19 20:54:26 crc kubenswrapper[4799]: I0319 20:54:26.895210 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w6jjc" podStartSLOduration=3.330994988 podStartE2EDuration="5.895191696s" podCreationTimestamp="2026-03-19 20:54:21 +0000 UTC" firstStartedPulling="2026-03-19 20:54:23.833469645 +0000 UTC m=+2941.439422717" lastFinishedPulling="2026-03-19 20:54:26.397666343 +0000 UTC m=+2944.003619425" observedRunningTime="2026-03-19 20:54:26.889797185 +0000 UTC m=+2944.495750257" watchObservedRunningTime="2026-03-19 20:54:26.895191696 +0000 UTC m=+2944.501144768" Mar 19 20:54:28 crc kubenswrapper[4799]: I0319 20:54:28.755883 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:54:28 crc kubenswrapper[4799]: I0319 20:54:28.756146 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:54:32 crc kubenswrapper[4799]: I0319 20:54:32.628895 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:32 crc kubenswrapper[4799]: I0319 20:54:32.629581 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:32 crc kubenswrapper[4799]: I0319 20:54:32.725519 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:33 crc kubenswrapper[4799]: I0319 20:54:33.003748 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:33 crc kubenswrapper[4799]: I0319 20:54:33.068653 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6jjc"] Mar 19 20:54:34 crc kubenswrapper[4799]: I0319 20:54:34.966071 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w6jjc" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="registry-server" containerID="cri-o://1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac" gracePeriod=2 Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.476432 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.604817 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-utilities\") pod \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.604886 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-catalog-content\") pod \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.605196 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5tfz\" (UniqueName: \"kubernetes.io/projected/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-kube-api-access-x5tfz\") pod \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\" (UID: \"a87dcf61-76ff-46c7-bd61-5e25f725f6f1\") " Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.605797 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-utilities" (OuterVolumeSpecName: "utilities") pod "a87dcf61-76ff-46c7-bd61-5e25f725f6f1" (UID: "a87dcf61-76ff-46c7-bd61-5e25f725f6f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.613474 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-kube-api-access-x5tfz" (OuterVolumeSpecName: "kube-api-access-x5tfz") pod "a87dcf61-76ff-46c7-bd61-5e25f725f6f1" (UID: "a87dcf61-76ff-46c7-bd61-5e25f725f6f1"). InnerVolumeSpecName "kube-api-access-x5tfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.648512 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a87dcf61-76ff-46c7-bd61-5e25f725f6f1" (UID: "a87dcf61-76ff-46c7-bd61-5e25f725f6f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.707003 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5tfz\" (UniqueName: \"kubernetes.io/projected/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-kube-api-access-x5tfz\") on node \"crc\" DevicePath \"\"" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.707050 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.707064 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a87dcf61-76ff-46c7-bd61-5e25f725f6f1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.981988 4799 generic.go:334] "Generic (PLEG): container finished" podID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerID="1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac" exitCode=0 Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.982032 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerDied","Data":"1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac"} Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.982078 4799 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-w6jjc" event={"ID":"a87dcf61-76ff-46c7-bd61-5e25f725f6f1","Type":"ContainerDied","Data":"ae74c2a9516961a9d49e554f2c2a9ca3fd1dfcb65b10c6eaad51ea5729f28754"} Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.982096 4799 scope.go:117] "RemoveContainer" containerID="1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac" Mar 19 20:54:35 crc kubenswrapper[4799]: I0319 20:54:35.983064 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6jjc" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.016254 4799 scope.go:117] "RemoveContainer" containerID="2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.032507 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6jjc"] Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.041919 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6jjc"] Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.055248 4799 scope.go:117] "RemoveContainer" containerID="93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.144035 4799 scope.go:117] "RemoveContainer" containerID="1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac" Mar 19 20:54:36 crc kubenswrapper[4799]: E0319 20:54:36.147796 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac\": container with ID starting with 1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac not found: ID does not exist" containerID="1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.147841 4799 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac"} err="failed to get container status \"1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac\": rpc error: code = NotFound desc = could not find container \"1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac\": container with ID starting with 1ac087f49721caa19930ff07599c429a5faac234263d515538a2313da360e6ac not found: ID does not exist" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.147871 4799 scope.go:117] "RemoveContainer" containerID="2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b" Mar 19 20:54:36 crc kubenswrapper[4799]: E0319 20:54:36.152842 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b\": container with ID starting with 2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b not found: ID does not exist" containerID="2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.152891 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b"} err="failed to get container status \"2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b\": rpc error: code = NotFound desc = could not find container \"2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b\": container with ID starting with 2db02747d5d50f5a70ec78320ca7a47c109a8e37ba6d9d118ec0ccc6ebd1773b not found: ID does not exist" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.152919 4799 scope.go:117] "RemoveContainer" containerID="93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9" Mar 19 20:54:36 crc kubenswrapper[4799]: E0319 
20:54:36.160899 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9\": container with ID starting with 93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9 not found: ID does not exist" containerID="93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9" Mar 19 20:54:36 crc kubenswrapper[4799]: I0319 20:54:36.160949 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9"} err="failed to get container status \"93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9\": rpc error: code = NotFound desc = could not find container \"93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9\": container with ID starting with 93d33cc9b11227f9153c69174b86cfc82980ef47351c913e2807f49dc37238e9 not found: ID does not exist" Mar 19 20:54:37 crc kubenswrapper[4799]: I0319 20:54:37.129685 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" path="/var/lib/kubelet/pods/a87dcf61-76ff-46c7-bd61-5e25f725f6f1/volumes" Mar 19 20:54:58 crc kubenswrapper[4799]: I0319 20:54:58.756552 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:54:58 crc kubenswrapper[4799]: I0319 20:54:58.757212 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 19 20:55:28 crc kubenswrapper[4799]: I0319 20:55:28.756080 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:55:28 crc kubenswrapper[4799]: I0319 20:55:28.756574 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:55:28 crc kubenswrapper[4799]: I0319 20:55:28.756622 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:55:28 crc kubenswrapper[4799]: I0319 20:55:28.757210 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"babe0dde2e8ae616818bc758cddcd4e3f5c5c20e0c622347f9b7f7191cc4b383"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:55:28 crc kubenswrapper[4799]: I0319 20:55:28.757252 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://babe0dde2e8ae616818bc758cddcd4e3f5c5c20e0c622347f9b7f7191cc4b383" gracePeriod=600 Mar 19 20:55:29 crc kubenswrapper[4799]: I0319 20:55:29.516358 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" 
containerID="babe0dde2e8ae616818bc758cddcd4e3f5c5c20e0c622347f9b7f7191cc4b383" exitCode=0 Mar 19 20:55:29 crc kubenswrapper[4799]: I0319 20:55:29.516439 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"babe0dde2e8ae616818bc758cddcd4e3f5c5c20e0c622347f9b7f7191cc4b383"} Mar 19 20:55:29 crc kubenswrapper[4799]: I0319 20:55:29.517036 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365"} Mar 19 20:55:29 crc kubenswrapper[4799]: I0319 20:55:29.517073 4799 scope.go:117] "RemoveContainer" containerID="14ca93db5af0e6e92f30ef6d8d932feda3ee7e1ab4f10d4ddab604db4830de79" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.159058 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565896-pzvm6"] Mar 19 20:56:00 crc kubenswrapper[4799]: E0319 20:56:00.160271 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="extract-content" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.160296 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="extract-content" Mar 19 20:56:00 crc kubenswrapper[4799]: E0319 20:56:00.160321 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="extract-utilities" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.160332 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="extract-utilities" Mar 19 20:56:00 crc kubenswrapper[4799]: E0319 20:56:00.160351 4799 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="registry-server" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.160364 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="registry-server" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.160681 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87dcf61-76ff-46c7-bd61-5e25f725f6f1" containerName="registry-server" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.161574 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.164657 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.164783 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.166369 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.172219 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565896-pzvm6"] Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.329865 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtt8\" (UniqueName: \"kubernetes.io/projected/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd-kube-api-access-lqtt8\") pod \"auto-csr-approver-29565896-pzvm6\" (UID: \"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd\") " pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.432284 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lqtt8\" (UniqueName: \"kubernetes.io/projected/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd-kube-api-access-lqtt8\") pod \"auto-csr-approver-29565896-pzvm6\" (UID: \"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd\") " pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.451257 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtt8\" (UniqueName: \"kubernetes.io/projected/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd-kube-api-access-lqtt8\") pod \"auto-csr-approver-29565896-pzvm6\" (UID: \"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd\") " pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.485541 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:00 crc kubenswrapper[4799]: I0319 20:56:00.984250 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565896-pzvm6"] Mar 19 20:56:01 crc kubenswrapper[4799]: I0319 20:56:01.900235 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" event={"ID":"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd","Type":"ContainerStarted","Data":"2a84076b1445effdf3b2df2282245e33ceba241c25945cf0850c27222b9e5171"} Mar 19 20:56:03 crc kubenswrapper[4799]: I0319 20:56:03.924677 4799 generic.go:334] "Generic (PLEG): container finished" podID="e43eb2d4-4e9e-4d8c-a149-731e1a0431dd" containerID="b84c2eaa9ebbde8cae13e91dc50a7741979ccb7a9ceff1a910b03101170653c6" exitCode=0 Mar 19 20:56:03 crc kubenswrapper[4799]: I0319 20:56:03.924788 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" event={"ID":"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd","Type":"ContainerDied","Data":"b84c2eaa9ebbde8cae13e91dc50a7741979ccb7a9ceff1a910b03101170653c6"} Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 
20:56:05.398710 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 20:56:05.549111 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqtt8\" (UniqueName: \"kubernetes.io/projected/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd-kube-api-access-lqtt8\") pod \"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd\" (UID: \"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd\") " Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 20:56:05.554827 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd-kube-api-access-lqtt8" (OuterVolumeSpecName: "kube-api-access-lqtt8") pod "e43eb2d4-4e9e-4d8c-a149-731e1a0431dd" (UID: "e43eb2d4-4e9e-4d8c-a149-731e1a0431dd"). InnerVolumeSpecName "kube-api-access-lqtt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 20:56:05.652181 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqtt8\" (UniqueName: \"kubernetes.io/projected/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd-kube-api-access-lqtt8\") on node \"crc\" DevicePath \"\"" Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 20:56:05.945435 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" event={"ID":"e43eb2d4-4e9e-4d8c-a149-731e1a0431dd","Type":"ContainerDied","Data":"2a84076b1445effdf3b2df2282245e33ceba241c25945cf0850c27222b9e5171"} Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 20:56:05.945491 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a84076b1445effdf3b2df2282245e33ceba241c25945cf0850c27222b9e5171" Mar 19 20:56:05 crc kubenswrapper[4799]: I0319 20:56:05.945525 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565896-pzvm6" Mar 19 20:56:06 crc kubenswrapper[4799]: I0319 20:56:06.471022 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565890-dtft2"] Mar 19 20:56:06 crc kubenswrapper[4799]: I0319 20:56:06.491287 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565890-dtft2"] Mar 19 20:56:07 crc kubenswrapper[4799]: I0319 20:56:07.129767 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943724bd-8de6-49cd-b029-ca39c51b4444" path="/var/lib/kubelet/pods/943724bd-8de6-49cd-b029-ca39c51b4444/volumes" Mar 19 20:56:21 crc kubenswrapper[4799]: I0319 20:56:21.138886 4799 scope.go:117] "RemoveContainer" containerID="cb678004ef9e2ecb1e344d84b09bc75eb5c686425163799338f912070ed43ee1" Mar 19 20:57:22 crc kubenswrapper[4799]: I0319 20:57:22.905947 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zqcgc"] Mar 19 20:57:22 crc kubenswrapper[4799]: E0319 20:57:22.907838 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43eb2d4-4e9e-4d8c-a149-731e1a0431dd" containerName="oc" Mar 19 20:57:22 crc kubenswrapper[4799]: I0319 20:57:22.909472 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43eb2d4-4e9e-4d8c-a149-731e1a0431dd" containerName="oc" Mar 19 20:57:22 crc kubenswrapper[4799]: I0319 20:57:22.910115 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43eb2d4-4e9e-4d8c-a149-731e1a0431dd" containerName="oc" Mar 19 20:57:22 crc kubenswrapper[4799]: I0319 20:57:22.911803 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:22 crc kubenswrapper[4799]: I0319 20:57:22.924209 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zqcgc"] Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.033741 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-utilities\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.033898 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thvt9\" (UniqueName: \"kubernetes.io/projected/813c7ca9-e89f-4a22-b099-b1184a1fa141-kube-api-access-thvt9\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.033993 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-catalog-content\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.136004 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-utilities\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.136412 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-thvt9\" (UniqueName: \"kubernetes.io/projected/813c7ca9-e89f-4a22-b099-b1184a1fa141-kube-api-access-thvt9\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.136538 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-catalog-content\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.137906 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-utilities\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.138671 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-catalog-content\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.165564 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thvt9\" (UniqueName: \"kubernetes.io/projected/813c7ca9-e89f-4a22-b099-b1184a1fa141-kube-api-access-thvt9\") pod \"community-operators-zqcgc\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.253344 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.776196 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zqcgc"] Mar 19 20:57:23 crc kubenswrapper[4799]: I0319 20:57:23.829253 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerStarted","Data":"3c9b7ae0bf8889b0b09f9194ad988b955bb0e762b11e7c8790a14bf469331f7b"} Mar 19 20:57:24 crc kubenswrapper[4799]: I0319 20:57:24.844889 4799 generic.go:334] "Generic (PLEG): container finished" podID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerID="b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412" exitCode=0 Mar 19 20:57:24 crc kubenswrapper[4799]: I0319 20:57:24.845421 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerDied","Data":"b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412"} Mar 19 20:57:26 crc kubenswrapper[4799]: I0319 20:57:26.872256 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerStarted","Data":"59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243"} Mar 19 20:57:27 crc kubenswrapper[4799]: I0319 20:57:27.887900 4799 generic.go:334] "Generic (PLEG): container finished" podID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerID="59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243" exitCode=0 Mar 19 20:57:27 crc kubenswrapper[4799]: I0319 20:57:27.888012 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" 
event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerDied","Data":"59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243"} Mar 19 20:57:28 crc kubenswrapper[4799]: I0319 20:57:28.902197 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerStarted","Data":"a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe"} Mar 19 20:57:28 crc kubenswrapper[4799]: I0319 20:57:28.937731 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zqcgc" podStartSLOduration=3.445308262 podStartE2EDuration="6.937709723s" podCreationTimestamp="2026-03-19 20:57:22 +0000 UTC" firstStartedPulling="2026-03-19 20:57:24.848833761 +0000 UTC m=+3122.454786873" lastFinishedPulling="2026-03-19 20:57:28.341235262 +0000 UTC m=+3125.947188334" observedRunningTime="2026-03-19 20:57:28.935077629 +0000 UTC m=+3126.541030711" watchObservedRunningTime="2026-03-19 20:57:28.937709723 +0000 UTC m=+3126.543662795" Mar 19 20:57:33 crc kubenswrapper[4799]: I0319 20:57:33.254587 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:33 crc kubenswrapper[4799]: I0319 20:57:33.255282 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:33 crc kubenswrapper[4799]: I0319 20:57:33.320419 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:34 crc kubenswrapper[4799]: I0319 20:57:34.018595 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:35 crc kubenswrapper[4799]: I0319 20:57:35.494814 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-zqcgc"] Mar 19 20:57:35 crc kubenswrapper[4799]: I0319 20:57:35.982121 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zqcgc" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="registry-server" containerID="cri-o://a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe" gracePeriod=2 Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.604719 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.679989 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-catalog-content\") pod \"813c7ca9-e89f-4a22-b099-b1184a1fa141\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.680164 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-utilities\") pod \"813c7ca9-e89f-4a22-b099-b1184a1fa141\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.680203 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thvt9\" (UniqueName: \"kubernetes.io/projected/813c7ca9-e89f-4a22-b099-b1184a1fa141-kube-api-access-thvt9\") pod \"813c7ca9-e89f-4a22-b099-b1184a1fa141\" (UID: \"813c7ca9-e89f-4a22-b099-b1184a1fa141\") " Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.683136 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-utilities" (OuterVolumeSpecName: "utilities") pod "813c7ca9-e89f-4a22-b099-b1184a1fa141" (UID: 
"813c7ca9-e89f-4a22-b099-b1184a1fa141"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.694057 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813c7ca9-e89f-4a22-b099-b1184a1fa141-kube-api-access-thvt9" (OuterVolumeSpecName: "kube-api-access-thvt9") pod "813c7ca9-e89f-4a22-b099-b1184a1fa141" (UID: "813c7ca9-e89f-4a22-b099-b1184a1fa141"). InnerVolumeSpecName "kube-api-access-thvt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.754312 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "813c7ca9-e89f-4a22-b099-b1184a1fa141" (UID: "813c7ca9-e89f-4a22-b099-b1184a1fa141"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.783663 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.784092 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/813c7ca9-e89f-4a22-b099-b1184a1fa141-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.784113 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thvt9\" (UniqueName: \"kubernetes.io/projected/813c7ca9-e89f-4a22-b099-b1184a1fa141-kube-api-access-thvt9\") on node \"crc\" DevicePath \"\"" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.997283 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerID="a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe" exitCode=0 Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.997350 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerDied","Data":"a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe"} Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.997362 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqcgc" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.997472 4799 scope.go:117] "RemoveContainer" containerID="a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe" Mar 19 20:57:36 crc kubenswrapper[4799]: I0319 20:57:36.997451 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqcgc" event={"ID":"813c7ca9-e89f-4a22-b099-b1184a1fa141","Type":"ContainerDied","Data":"3c9b7ae0bf8889b0b09f9194ad988b955bb0e762b11e7c8790a14bf469331f7b"} Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.028333 4799 scope.go:117] "RemoveContainer" containerID="59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.049500 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zqcgc"] Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.071238 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zqcgc"] Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.082176 4799 scope.go:117] "RemoveContainer" containerID="b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.129926 4799 scope.go:117] "RemoveContainer" 
containerID="a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe" Mar 19 20:57:37 crc kubenswrapper[4799]: E0319 20:57:37.130458 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe\": container with ID starting with a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe not found: ID does not exist" containerID="a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.130512 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe"} err="failed to get container status \"a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe\": rpc error: code = NotFound desc = could not find container \"a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe\": container with ID starting with a4ffe2b3d4aa748ab0b9fa50a8efdf83017531eec5003e1f1ed666818e47aefe not found: ID does not exist" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.130550 4799 scope.go:117] "RemoveContainer" containerID="59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243" Mar 19 20:57:37 crc kubenswrapper[4799]: E0319 20:57:37.130985 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243\": container with ID starting with 59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243 not found: ID does not exist" containerID="59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.131150 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243"} err="failed to get container status \"59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243\": rpc error: code = NotFound desc = could not find container \"59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243\": container with ID starting with 59b87d2b6532b3bf6ffeb2eb66ef5561a12c476a06f01a703294dc44e577e243 not found: ID does not exist" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.131289 4799 scope.go:117] "RemoveContainer" containerID="b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412" Mar 19 20:57:37 crc kubenswrapper[4799]: E0319 20:57:37.132155 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412\": container with ID starting with b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412 not found: ID does not exist" containerID="b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.132193 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412"} err="failed to get container status \"b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412\": rpc error: code = NotFound desc = could not find container \"b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412\": container with ID starting with b2385a43c6f4341d0640af8d41b4fd7b6bd698a2e3937bd1ed279719bdc7a412 not found: ID does not exist" Mar 19 20:57:37 crc kubenswrapper[4799]: I0319 20:57:37.135854 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" path="/var/lib/kubelet/pods/813c7ca9-e89f-4a22-b099-b1184a1fa141/volumes" Mar 19 20:57:58 crc kubenswrapper[4799]: I0319 
20:57:58.756104 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:57:58 crc kubenswrapper[4799]: I0319 20:57:58.756546 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.153873 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565898-vcv8t"] Mar 19 20:58:00 crc kubenswrapper[4799]: E0319 20:58:00.154854 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="registry-server" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.154878 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="registry-server" Mar 19 20:58:00 crc kubenswrapper[4799]: E0319 20:58:00.154901 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="extract-content" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.154914 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="extract-content" Mar 19 20:58:00 crc kubenswrapper[4799]: E0319 20:58:00.154934 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="extract-utilities" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.154945 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="extract-utilities" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.155283 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="813c7ca9-e89f-4a22-b099-b1184a1fa141" containerName="registry-server" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.156180 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.159157 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.160188 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.160343 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.216367 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565898-vcv8t"] Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.310601 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slnpc\" (UniqueName: \"kubernetes.io/projected/4948af39-6193-464c-8a26-94fe04ea30d6-kube-api-access-slnpc\") pod \"auto-csr-approver-29565898-vcv8t\" (UID: \"4948af39-6193-464c-8a26-94fe04ea30d6\") " pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.413015 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slnpc\" (UniqueName: \"kubernetes.io/projected/4948af39-6193-464c-8a26-94fe04ea30d6-kube-api-access-slnpc\") pod \"auto-csr-approver-29565898-vcv8t\" (UID: \"4948af39-6193-464c-8a26-94fe04ea30d6\") " 
pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.434784 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slnpc\" (UniqueName: \"kubernetes.io/projected/4948af39-6193-464c-8a26-94fe04ea30d6-kube-api-access-slnpc\") pod \"auto-csr-approver-29565898-vcv8t\" (UID: \"4948af39-6193-464c-8a26-94fe04ea30d6\") " pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.528173 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:00 crc kubenswrapper[4799]: I0319 20:58:00.975211 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565898-vcv8t"] Mar 19 20:58:01 crc kubenswrapper[4799]: I0319 20:58:01.285677 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" event={"ID":"4948af39-6193-464c-8a26-94fe04ea30d6","Type":"ContainerStarted","Data":"daacfa797e8905e229e05c38eb1696738395813c6e3145b979ede7607030a44d"} Mar 19 20:58:02 crc kubenswrapper[4799]: I0319 20:58:02.301245 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" event={"ID":"4948af39-6193-464c-8a26-94fe04ea30d6","Type":"ContainerStarted","Data":"bfe3e4a0377ff1424dcfe0532b81f80d51f7606cace805d2cd3ad95d7aba3491"} Mar 19 20:58:02 crc kubenswrapper[4799]: I0319 20:58:02.324024 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" podStartSLOduration=1.424351849 podStartE2EDuration="2.324005059s" podCreationTimestamp="2026-03-19 20:58:00 +0000 UTC" firstStartedPulling="2026-03-19 20:58:00.978241904 +0000 UTC m=+3158.584195006" lastFinishedPulling="2026-03-19 20:58:01.877895134 +0000 UTC m=+3159.483848216" observedRunningTime="2026-03-19 
20:58:02.316604862 +0000 UTC m=+3159.922557934" watchObservedRunningTime="2026-03-19 20:58:02.324005059 +0000 UTC m=+3159.929958131" Mar 19 20:58:03 crc kubenswrapper[4799]: I0319 20:58:03.318622 4799 generic.go:334] "Generic (PLEG): container finished" podID="4948af39-6193-464c-8a26-94fe04ea30d6" containerID="bfe3e4a0377ff1424dcfe0532b81f80d51f7606cace805d2cd3ad95d7aba3491" exitCode=0 Mar 19 20:58:03 crc kubenswrapper[4799]: I0319 20:58:03.318675 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" event={"ID":"4948af39-6193-464c-8a26-94fe04ea30d6","Type":"ContainerDied","Data":"bfe3e4a0377ff1424dcfe0532b81f80d51f7606cace805d2cd3ad95d7aba3491"} Mar 19 20:58:04 crc kubenswrapper[4799]: I0319 20:58:04.636613 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:04 crc kubenswrapper[4799]: I0319 20:58:04.808995 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slnpc\" (UniqueName: \"kubernetes.io/projected/4948af39-6193-464c-8a26-94fe04ea30d6-kube-api-access-slnpc\") pod \"4948af39-6193-464c-8a26-94fe04ea30d6\" (UID: \"4948af39-6193-464c-8a26-94fe04ea30d6\") " Mar 19 20:58:04 crc kubenswrapper[4799]: I0319 20:58:04.817142 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4948af39-6193-464c-8a26-94fe04ea30d6-kube-api-access-slnpc" (OuterVolumeSpecName: "kube-api-access-slnpc") pod "4948af39-6193-464c-8a26-94fe04ea30d6" (UID: "4948af39-6193-464c-8a26-94fe04ea30d6"). InnerVolumeSpecName "kube-api-access-slnpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:58:04 crc kubenswrapper[4799]: I0319 20:58:04.912618 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slnpc\" (UniqueName: \"kubernetes.io/projected/4948af39-6193-464c-8a26-94fe04ea30d6-kube-api-access-slnpc\") on node \"crc\" DevicePath \"\"" Mar 19 20:58:05 crc kubenswrapper[4799]: I0319 20:58:05.341984 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" event={"ID":"4948af39-6193-464c-8a26-94fe04ea30d6","Type":"ContainerDied","Data":"daacfa797e8905e229e05c38eb1696738395813c6e3145b979ede7607030a44d"} Mar 19 20:58:05 crc kubenswrapper[4799]: I0319 20:58:05.342056 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565898-vcv8t" Mar 19 20:58:05 crc kubenswrapper[4799]: I0319 20:58:05.342065 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daacfa797e8905e229e05c38eb1696738395813c6e3145b979ede7607030a44d" Mar 19 20:58:05 crc kubenswrapper[4799]: I0319 20:58:05.461436 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565892-fgq5j"] Mar 19 20:58:05 crc kubenswrapper[4799]: I0319 20:58:05.471621 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565892-fgq5j"] Mar 19 20:58:07 crc kubenswrapper[4799]: I0319 20:58:07.137157 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f8c752-4955-4d9e-afc3-0163858f464b" path="/var/lib/kubelet/pods/d8f8c752-4955-4d9e-afc3-0163858f464b/volumes" Mar 19 20:58:21 crc kubenswrapper[4799]: I0319 20:58:21.292233 4799 scope.go:117] "RemoveContainer" containerID="f01a47bd98e0bf0ae6328cb420f1700edfca58c1557266df7c91957ae9690a2b" Mar 19 20:58:28 crc kubenswrapper[4799]: I0319 20:58:28.755961 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:58:28 crc kubenswrapper[4799]: I0319 20:58:28.756794 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.755980 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.756632 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.756683 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.757557 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.757631 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" gracePeriod=600 Mar 19 20:58:58 crc kubenswrapper[4799]: E0319 20:58:58.901718 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.923759 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" exitCode=0 Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.923803 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365"} Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.923842 4799 scope.go:117] "RemoveContainer" containerID="babe0dde2e8ae616818bc758cddcd4e3f5c5c20e0c622347f9b7f7191cc4b383" Mar 19 20:58:58 crc kubenswrapper[4799]: I0319 20:58:58.924908 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 20:58:58 crc kubenswrapper[4799]: E0319 20:58:58.925537 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:59:13 crc kubenswrapper[4799]: I0319 20:59:13.137026 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 20:59:13 crc kubenswrapper[4799]: E0319 20:59:13.138052 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:59:28 crc kubenswrapper[4799]: I0319 20:59:28.117166 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 20:59:28 crc kubenswrapper[4799]: E0319 20:59:28.117961 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:59:43 crc kubenswrapper[4799]: I0319 20:59:43.136076 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 20:59:43 crc kubenswrapper[4799]: E0319 20:59:43.137166 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 20:59:55 crc kubenswrapper[4799]: I0319 20:59:55.116809 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 20:59:55 crc kubenswrapper[4799]: E0319 20:59:55.118350 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.172209 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565900-dw4qs"] Mar 19 21:00:00 crc kubenswrapper[4799]: E0319 21:00:00.173542 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4948af39-6193-464c-8a26-94fe04ea30d6" containerName="oc" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.173564 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="4948af39-6193-464c-8a26-94fe04ea30d6" containerName="oc" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.173889 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="4948af39-6193-464c-8a26-94fe04ea30d6" containerName="oc" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.174888 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.204784 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w"] Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.206128 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565900-dw4qs"] Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.206149 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w"] Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.206227 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.209823 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.210104 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.210223 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.210696 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.211448 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.363631 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/395dc3d1-aa64-48b9-acc0-eec818c4a73b-secret-volume\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.363682 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/395dc3d1-aa64-48b9-acc0-eec818c4a73b-config-volume\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.363800 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwjg\" (UniqueName: \"kubernetes.io/projected/cb8e45b9-668d-422d-ae1c-2ca61c39401c-kube-api-access-ggwjg\") pod \"auto-csr-approver-29565900-dw4qs\" (UID: \"cb8e45b9-668d-422d-ae1c-2ca61c39401c\") " pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.363881 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvw7\" (UniqueName: \"kubernetes.io/projected/395dc3d1-aa64-48b9-acc0-eec818c4a73b-kube-api-access-2cvw7\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.465254 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/395dc3d1-aa64-48b9-acc0-eec818c4a73b-secret-volume\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 
21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.465308 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/395dc3d1-aa64-48b9-acc0-eec818c4a73b-config-volume\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.466164 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/395dc3d1-aa64-48b9-acc0-eec818c4a73b-config-volume\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.466228 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwjg\" (UniqueName: \"kubernetes.io/projected/cb8e45b9-668d-422d-ae1c-2ca61c39401c-kube-api-access-ggwjg\") pod \"auto-csr-approver-29565900-dw4qs\" (UID: \"cb8e45b9-668d-422d-ae1c-2ca61c39401c\") " pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.466618 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvw7\" (UniqueName: \"kubernetes.io/projected/395dc3d1-aa64-48b9-acc0-eec818c4a73b-kube-api-access-2cvw7\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.480799 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/395dc3d1-aa64-48b9-acc0-eec818c4a73b-secret-volume\") pod \"collect-profiles-29565900-gkc5w\" (UID: 
\"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.485146 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvw7\" (UniqueName: \"kubernetes.io/projected/395dc3d1-aa64-48b9-acc0-eec818c4a73b-kube-api-access-2cvw7\") pod \"collect-profiles-29565900-gkc5w\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.498441 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwjg\" (UniqueName: \"kubernetes.io/projected/cb8e45b9-668d-422d-ae1c-2ca61c39401c-kube-api-access-ggwjg\") pod \"auto-csr-approver-29565900-dw4qs\" (UID: \"cb8e45b9-668d-422d-ae1c-2ca61c39401c\") " pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.548215 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:00 crc kubenswrapper[4799]: I0319 21:00:00.558666 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.053184 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565900-dw4qs"] Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.066817 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.075975 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w"] Mar 19 21:00:01 crc kubenswrapper[4799]: W0319 21:00:01.076861 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395dc3d1_aa64_48b9_acc0_eec818c4a73b.slice/crio-fdd735a684733559c590280ca1800dadffacc3d5b450afcef8e616e6059e3f31 WatchSource:0}: Error finding container fdd735a684733559c590280ca1800dadffacc3d5b450afcef8e616e6059e3f31: Status 404 returned error can't find the container with id fdd735a684733559c590280ca1800dadffacc3d5b450afcef8e616e6059e3f31 Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.599856 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" event={"ID":"cb8e45b9-668d-422d-ae1c-2ca61c39401c","Type":"ContainerStarted","Data":"50b8e49b8ffa121698ac6f8d7f9e20a2cca63129095f05d08ff629288aced853"} Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.601539 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" event={"ID":"395dc3d1-aa64-48b9-acc0-eec818c4a73b","Type":"ContainerStarted","Data":"389a4acefc6e783c37e95b2efa23ee92ee50c21f0f60e2670a42cf613f9b0894"} Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.601581 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" event={"ID":"395dc3d1-aa64-48b9-acc0-eec818c4a73b","Type":"ContainerStarted","Data":"fdd735a684733559c590280ca1800dadffacc3d5b450afcef8e616e6059e3f31"} Mar 19 21:00:01 crc kubenswrapper[4799]: I0319 21:00:01.625866 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" podStartSLOduration=1.625842378 podStartE2EDuration="1.625842378s" podCreationTimestamp="2026-03-19 21:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 21:00:01.621116245 +0000 UTC m=+3279.227069337" watchObservedRunningTime="2026-03-19 21:00:01.625842378 +0000 UTC m=+3279.231795460" Mar 19 21:00:02 crc kubenswrapper[4799]: I0319 21:00:02.618803 4799 generic.go:334] "Generic (PLEG): container finished" podID="395dc3d1-aa64-48b9-acc0-eec818c4a73b" containerID="389a4acefc6e783c37e95b2efa23ee92ee50c21f0f60e2670a42cf613f9b0894" exitCode=0 Mar 19 21:00:02 crc kubenswrapper[4799]: I0319 21:00:02.618932 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" event={"ID":"395dc3d1-aa64-48b9-acc0-eec818c4a73b","Type":"ContainerDied","Data":"389a4acefc6e783c37e95b2efa23ee92ee50c21f0f60e2670a42cf613f9b0894"} Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.065845 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.153589 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/395dc3d1-aa64-48b9-acc0-eec818c4a73b-secret-volume\") pod \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.153677 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/395dc3d1-aa64-48b9-acc0-eec818c4a73b-config-volume\") pod \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.153844 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cvw7\" (UniqueName: \"kubernetes.io/projected/395dc3d1-aa64-48b9-acc0-eec818c4a73b-kube-api-access-2cvw7\") pod \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\" (UID: \"395dc3d1-aa64-48b9-acc0-eec818c4a73b\") " Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.154708 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395dc3d1-aa64-48b9-acc0-eec818c4a73b-config-volume" (OuterVolumeSpecName: "config-volume") pod "395dc3d1-aa64-48b9-acc0-eec818c4a73b" (UID: "395dc3d1-aa64-48b9-acc0-eec818c4a73b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.161680 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/395dc3d1-aa64-48b9-acc0-eec818c4a73b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "395dc3d1-aa64-48b9-acc0-eec818c4a73b" (UID: "395dc3d1-aa64-48b9-acc0-eec818c4a73b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.161748 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395dc3d1-aa64-48b9-acc0-eec818c4a73b-kube-api-access-2cvw7" (OuterVolumeSpecName: "kube-api-access-2cvw7") pod "395dc3d1-aa64-48b9-acc0-eec818c4a73b" (UID: "395dc3d1-aa64-48b9-acc0-eec818c4a73b"). InnerVolumeSpecName "kube-api-access-2cvw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.255414 4799 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/395dc3d1-aa64-48b9-acc0-eec818c4a73b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.255446 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/395dc3d1-aa64-48b9-acc0-eec818c4a73b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.255455 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cvw7\" (UniqueName: \"kubernetes.io/projected/395dc3d1-aa64-48b9-acc0-eec818c4a73b-kube-api-access-2cvw7\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.639945 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" event={"ID":"395dc3d1-aa64-48b9-acc0-eec818c4a73b","Type":"ContainerDied","Data":"fdd735a684733559c590280ca1800dadffacc3d5b450afcef8e616e6059e3f31"} Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.639980 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd735a684733559c590280ca1800dadffacc3d5b450afcef8e616e6059e3f31" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.640024 4799 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565900-gkc5w" Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.707997 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk"] Mar 19 21:00:04 crc kubenswrapper[4799]: I0319 21:00:04.718723 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-dlpsk"] Mar 19 21:00:05 crc kubenswrapper[4799]: I0319 21:00:05.134037 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6313da4-6572-481f-888e-db433419606a" path="/var/lib/kubelet/pods/c6313da4-6572-481f-888e-db433419606a/volumes" Mar 19 21:00:05 crc kubenswrapper[4799]: I0319 21:00:05.658187 4799 generic.go:334] "Generic (PLEG): container finished" podID="cb8e45b9-668d-422d-ae1c-2ca61c39401c" containerID="03979ec3cad5ccac0e7d0c965d1a524b71a193e9f69bc619e906a545d25a1cd0" exitCode=0 Mar 19 21:00:05 crc kubenswrapper[4799]: I0319 21:00:05.658245 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" event={"ID":"cb8e45b9-668d-422d-ae1c-2ca61c39401c","Type":"ContainerDied","Data":"03979ec3cad5ccac0e7d0c965d1a524b71a193e9f69bc619e906a545d25a1cd0"} Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.145853 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.215596 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggwjg\" (UniqueName: \"kubernetes.io/projected/cb8e45b9-668d-422d-ae1c-2ca61c39401c-kube-api-access-ggwjg\") pod \"cb8e45b9-668d-422d-ae1c-2ca61c39401c\" (UID: \"cb8e45b9-668d-422d-ae1c-2ca61c39401c\") " Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.222710 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8e45b9-668d-422d-ae1c-2ca61c39401c-kube-api-access-ggwjg" (OuterVolumeSpecName: "kube-api-access-ggwjg") pod "cb8e45b9-668d-422d-ae1c-2ca61c39401c" (UID: "cb8e45b9-668d-422d-ae1c-2ca61c39401c"). InnerVolumeSpecName "kube-api-access-ggwjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.316910 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggwjg\" (UniqueName: \"kubernetes.io/projected/cb8e45b9-668d-422d-ae1c-2ca61c39401c-kube-api-access-ggwjg\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.682366 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" event={"ID":"cb8e45b9-668d-422d-ae1c-2ca61c39401c","Type":"ContainerDied","Data":"50b8e49b8ffa121698ac6f8d7f9e20a2cca63129095f05d08ff629288aced853"} Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.682448 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b8e49b8ffa121698ac6f8d7f9e20a2cca63129095f05d08ff629288aced853" Mar 19 21:00:07 crc kubenswrapper[4799]: I0319 21:00:07.682487 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565900-dw4qs" Mar 19 21:00:08 crc kubenswrapper[4799]: I0319 21:00:08.118067 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:00:08 crc kubenswrapper[4799]: E0319 21:00:08.119649 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:00:08 crc kubenswrapper[4799]: I0319 21:00:08.217119 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565894-6nf54"] Mar 19 21:00:08 crc kubenswrapper[4799]: I0319 21:00:08.227374 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565894-6nf54"] Mar 19 21:00:09 crc kubenswrapper[4799]: I0319 21:00:09.127313 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be40fe5-fb7f-45c0-b312-30abdc732828" path="/var/lib/kubelet/pods/4be40fe5-fb7f-45c0-b312-30abdc732828/volumes" Mar 19 21:00:19 crc kubenswrapper[4799]: I0319 21:00:19.116767 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:00:19 crc kubenswrapper[4799]: E0319 21:00:19.117481 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:00:21 crc kubenswrapper[4799]: I0319 21:00:21.674077 4799 scope.go:117] "RemoveContainer" containerID="6bb1ff36d0a799238ce792173817baca9762c92e6379ac10e817f97dcea79059" Mar 19 21:00:21 crc kubenswrapper[4799]: I0319 21:00:21.727643 4799 scope.go:117] "RemoveContainer" containerID="80f2ed4df3f02d5343b74092621b46a2cadb33f009a1b3f5079c774694ed6edd" Mar 19 21:00:32 crc kubenswrapper[4799]: I0319 21:00:32.116248 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:00:32 crc kubenswrapper[4799]: E0319 21:00:32.117335 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:00:32 crc kubenswrapper[4799]: I0319 21:00:32.964181 4799 generic.go:334] "Generic (PLEG): container finished" podID="66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" containerID="8546b77afa6197cbd043fd216e8bccc4b5fca9b9c0891b4e0911c98d29c84dba" exitCode=0 Mar 19 21:00:32 crc kubenswrapper[4799]: I0319 21:00:32.964256 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a","Type":"ContainerDied","Data":"8546b77afa6197cbd043fd216e8bccc4b5fca9b9c0891b4e0911c98d29c84dba"} Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.468328 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635587 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ca-certs\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635649 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-config-data\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635709 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ssh-key\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635761 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635831 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-workdir\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635897 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.635966 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-temporary\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.636036 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config-secret\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.636108 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7pq4\" (UniqueName: \"kubernetes.io/projected/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-kube-api-access-j7pq4\") pod \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\" (UID: \"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a\") " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.643651 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.644443 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-config-data" (OuterVolumeSpecName: "config-data") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.644979 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.645011 4799 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.648691 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-kube-api-access-j7pq4" (OuterVolumeSpecName: "kube-api-access-j7pq4") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "kube-api-access-j7pq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.649184 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.649679 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.669793 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.677843 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.683688 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.721286 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" (UID: "66572ad6-a9d4-4dc7-ae3e-61a1d67d928a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747056 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747107 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7pq4\" (UniqueName: \"kubernetes.io/projected/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-kube-api-access-j7pq4\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747125 4799 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747149 4799 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747168 4799 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747186 4799 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" 
(UniqueName: \"kubernetes.io/empty-dir/66572ad6-a9d4-4dc7-ae3e-61a1d67d928a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.747227 4799 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.782712 4799 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.849894 4799 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.988004 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66572ad6-a9d4-4dc7-ae3e-61a1d67d928a","Type":"ContainerDied","Data":"b748b240e734a874366a43fddb43f8c1fcfbcb51043d5e306f578a35f35668ab"} Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.988055 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b748b240e734a874366a43fddb43f8c1fcfbcb51043d5e306f578a35f35668ab" Mar 19 21:00:34 crc kubenswrapper[4799]: I0319 21:00:34.988134 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 21:00:44 crc kubenswrapper[4799]: I0319 21:00:44.116355 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:00:44 crc kubenswrapper[4799]: E0319 21:00:44.117724 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.760326 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 21:00:46 crc kubenswrapper[4799]: E0319 21:00:46.761366 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8e45b9-668d-422d-ae1c-2ca61c39401c" containerName="oc" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.761414 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8e45b9-668d-422d-ae1c-2ca61c39401c" containerName="oc" Mar 19 21:00:46 crc kubenswrapper[4799]: E0319 21:00:46.761442 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395dc3d1-aa64-48b9-acc0-eec818c4a73b" containerName="collect-profiles" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.761458 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="395dc3d1-aa64-48b9-acc0-eec818c4a73b" containerName="collect-profiles" Mar 19 21:00:46 crc kubenswrapper[4799]: E0319 21:00:46.761485 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" containerName="tempest-tests-tempest-tests-runner" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.761502 4799 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" containerName="tempest-tests-tempest-tests-runner" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.761853 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb8e45b9-668d-422d-ae1c-2ca61c39401c" containerName="oc" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.761872 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="66572ad6-a9d4-4dc7-ae3e-61a1d67d928a" containerName="tempest-tests-tempest-tests-runner" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.761896 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="395dc3d1-aa64-48b9-acc0-eec818c4a73b" containerName="collect-profiles" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.762957 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.765926 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wxdjz" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.793743 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.938141 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:46 crc kubenswrapper[4799]: I0319 21:00:46.938289 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ppn\" (UniqueName: 
\"kubernetes.io/projected/e3586638-0807-4ea9-9027-0b953c5ea3cb-kube-api-access-x9ppn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.040080 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.040184 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ppn\" (UniqueName: \"kubernetes.io/projected/e3586638-0807-4ea9-9027-0b953c5ea3cb-kube-api-access-x9ppn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.040801 4799 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.069590 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ppn\" (UniqueName: \"kubernetes.io/projected/e3586638-0807-4ea9-9027-0b953c5ea3cb-kube-api-access-x9ppn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.101589 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e3586638-0807-4ea9-9027-0b953c5ea3cb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.394040 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 21:00:47 crc kubenswrapper[4799]: I0319 21:00:47.961921 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 21:00:48 crc kubenswrapper[4799]: I0319 21:00:48.170270 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e3586638-0807-4ea9-9027-0b953c5ea3cb","Type":"ContainerStarted","Data":"36a61b70bd0feedcdb7431640f7a3bad2fd50d3cd02b0d0620db75974f67ac8d"} Mar 19 21:00:49 crc kubenswrapper[4799]: I0319 21:00:49.183692 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e3586638-0807-4ea9-9027-0b953c5ea3cb","Type":"ContainerStarted","Data":"80856189b7555b925fb5d2dba862280965899f6fb0eaa98d636ca006cc70a389"} Mar 19 21:00:49 crc kubenswrapper[4799]: I0319 21:00:49.211823 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.304972304 podStartE2EDuration="3.211801268s" podCreationTimestamp="2026-03-19 21:00:46 +0000 UTC" firstStartedPulling="2026-03-19 21:00:47.964808303 +0000 UTC m=+3325.570761405" lastFinishedPulling="2026-03-19 
21:00:48.871637297 +0000 UTC m=+3326.477590369" observedRunningTime="2026-03-19 21:00:49.199362888 +0000 UTC m=+3326.805316000" watchObservedRunningTime="2026-03-19 21:00:49.211801268 +0000 UTC m=+3326.817754340" Mar 19 21:00:57 crc kubenswrapper[4799]: I0319 21:00:57.116445 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:00:57 crc kubenswrapper[4799]: E0319 21:00:57.117590 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.171700 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565901-g94sr"] Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.175279 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.202617 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565901-g94sr"] Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.232630 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-combined-ca-bundle\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.233082 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxpg\" (UniqueName: \"kubernetes.io/projected/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-kube-api-access-kqxpg\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.233272 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-config-data\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.233534 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-fernet-keys\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.335419 4799 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-fernet-keys\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.335563 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-combined-ca-bundle\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.335664 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxpg\" (UniqueName: \"kubernetes.io/projected/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-kube-api-access-kqxpg\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.335702 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-config-data\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.347022 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-combined-ca-bundle\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.348602 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-config-data\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.350977 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-fernet-keys\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.361104 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxpg\" (UniqueName: \"kubernetes.io/projected/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-kube-api-access-kqxpg\") pod \"keystone-cron-29565901-g94sr\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.497688 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:00 crc kubenswrapper[4799]: I0319 21:01:00.998251 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565901-g94sr"] Mar 19 21:01:01 crc kubenswrapper[4799]: I0319 21:01:01.325723 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565901-g94sr" event={"ID":"88b1db62-9d2d-4d6b-8a37-d45f1feb2319","Type":"ContainerStarted","Data":"05bf638c66879c9e036696af803dd47501a433b51539295af8a49b586e149d7e"} Mar 19 21:01:01 crc kubenswrapper[4799]: I0319 21:01:01.325788 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565901-g94sr" event={"ID":"88b1db62-9d2d-4d6b-8a37-d45f1feb2319","Type":"ContainerStarted","Data":"ad264f08c84bb0db6fe8504207856febc017aa72f2dfe611f6c41f7251b37b5c"} Mar 19 21:01:01 crc kubenswrapper[4799]: I0319 21:01:01.343556 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565901-g94sr" podStartSLOduration=1.343537995 podStartE2EDuration="1.343537995s" podCreationTimestamp="2026-03-19 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 21:01:01.337722121 +0000 UTC m=+3338.943675203" watchObservedRunningTime="2026-03-19 21:01:01.343537995 +0000 UTC m=+3338.949491077" Mar 19 21:01:04 crc kubenswrapper[4799]: I0319 21:01:04.361139 4799 generic.go:334] "Generic (PLEG): container finished" podID="88b1db62-9d2d-4d6b-8a37-d45f1feb2319" containerID="05bf638c66879c9e036696af803dd47501a433b51539295af8a49b586e149d7e" exitCode=0 Mar 19 21:01:04 crc kubenswrapper[4799]: I0319 21:01:04.361187 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565901-g94sr" 
event={"ID":"88b1db62-9d2d-4d6b-8a37-d45f1feb2319","Type":"ContainerDied","Data":"05bf638c66879c9e036696af803dd47501a433b51539295af8a49b586e149d7e"} Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.730616 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.858889 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-fernet-keys\") pod \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.859365 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqxpg\" (UniqueName: \"kubernetes.io/projected/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-kube-api-access-kqxpg\") pod \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.859484 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-config-data\") pod \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.859604 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-combined-ca-bundle\") pod \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\" (UID: \"88b1db62-9d2d-4d6b-8a37-d45f1feb2319\") " Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.866329 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-kube-api-access-kqxpg" 
(OuterVolumeSpecName: "kube-api-access-kqxpg") pod "88b1db62-9d2d-4d6b-8a37-d45f1feb2319" (UID: "88b1db62-9d2d-4d6b-8a37-d45f1feb2319"). InnerVolumeSpecName "kube-api-access-kqxpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.867055 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "88b1db62-9d2d-4d6b-8a37-d45f1feb2319" (UID: "88b1db62-9d2d-4d6b-8a37-d45f1feb2319"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.891973 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88b1db62-9d2d-4d6b-8a37-d45f1feb2319" (UID: "88b1db62-9d2d-4d6b-8a37-d45f1feb2319"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.954931 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-config-data" (OuterVolumeSpecName: "config-data") pod "88b1db62-9d2d-4d6b-8a37-d45f1feb2319" (UID: "88b1db62-9d2d-4d6b-8a37-d45f1feb2319"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.962283 4799 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.962325 4799 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.962341 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqxpg\" (UniqueName: \"kubernetes.io/projected/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-kube-api-access-kqxpg\") on node \"crc\" DevicePath \"\"" Mar 19 21:01:05 crc kubenswrapper[4799]: I0319 21:01:05.962356 4799 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b1db62-9d2d-4d6b-8a37-d45f1feb2319-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 21:01:06 crc kubenswrapper[4799]: I0319 21:01:06.381805 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565901-g94sr" event={"ID":"88b1db62-9d2d-4d6b-8a37-d45f1feb2319","Type":"ContainerDied","Data":"ad264f08c84bb0db6fe8504207856febc017aa72f2dfe611f6c41f7251b37b5c"} Mar 19 21:01:06 crc kubenswrapper[4799]: I0319 21:01:06.381844 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad264f08c84bb0db6fe8504207856febc017aa72f2dfe611f6c41f7251b37b5c" Mar 19 21:01:06 crc kubenswrapper[4799]: I0319 21:01:06.381903 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565901-g94sr" Mar 19 21:01:12 crc kubenswrapper[4799]: I0319 21:01:12.117525 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:01:12 crc kubenswrapper[4799]: E0319 21:01:12.118061 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.778999 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9466g/must-gather-l749f"] Mar 19 21:01:13 crc kubenswrapper[4799]: E0319 21:01:13.781570 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b1db62-9d2d-4d6b-8a37-d45f1feb2319" containerName="keystone-cron" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.781606 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b1db62-9d2d-4d6b-8a37-d45f1feb2319" containerName="keystone-cron" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.781912 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b1db62-9d2d-4d6b-8a37-d45f1feb2319" containerName="keystone-cron" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.783619 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.795087 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9466g"/"kube-root-ca.crt" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.795463 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9466g"/"openshift-service-ca.crt" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.817747 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9466g/must-gather-l749f"] Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.932939 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc2hp\" (UniqueName: \"kubernetes.io/projected/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-kube-api-access-dc2hp\") pod \"must-gather-l749f\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:13 crc kubenswrapper[4799]: I0319 21:01:13.933103 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-must-gather-output\") pod \"must-gather-l749f\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:14 crc kubenswrapper[4799]: I0319 21:01:14.034554 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-must-gather-output\") pod \"must-gather-l749f\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:14 crc kubenswrapper[4799]: I0319 21:01:14.034837 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dc2hp\" (UniqueName: \"kubernetes.io/projected/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-kube-api-access-dc2hp\") pod \"must-gather-l749f\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:14 crc kubenswrapper[4799]: I0319 21:01:14.035194 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-must-gather-output\") pod \"must-gather-l749f\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:14 crc kubenswrapper[4799]: I0319 21:01:14.052782 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc2hp\" (UniqueName: \"kubernetes.io/projected/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-kube-api-access-dc2hp\") pod \"must-gather-l749f\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:14 crc kubenswrapper[4799]: I0319 21:01:14.127527 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:01:14 crc kubenswrapper[4799]: I0319 21:01:14.611168 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9466g/must-gather-l749f"] Mar 19 21:01:15 crc kubenswrapper[4799]: I0319 21:01:15.503812 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/must-gather-l749f" event={"ID":"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd","Type":"ContainerStarted","Data":"40f41458c30c60bdf1066fb8e377a024cca7bed62f0f855273ed03b0e731208e"} Mar 19 21:01:22 crc kubenswrapper[4799]: I0319 21:01:22.581876 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/must-gather-l749f" event={"ID":"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd","Type":"ContainerStarted","Data":"552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24"} Mar 19 21:01:22 crc kubenswrapper[4799]: I0319 21:01:22.582448 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/must-gather-l749f" event={"ID":"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd","Type":"ContainerStarted","Data":"02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5"} Mar 19 21:01:22 crc kubenswrapper[4799]: I0319 21:01:22.616101 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9466g/must-gather-l749f" podStartSLOduration=2.699661805 podStartE2EDuration="9.616065448s" podCreationTimestamp="2026-03-19 21:01:13 +0000 UTC" firstStartedPulling="2026-03-19 21:01:14.613965938 +0000 UTC m=+3352.219919030" lastFinishedPulling="2026-03-19 21:01:21.530369581 +0000 UTC m=+3359.136322673" observedRunningTime="2026-03-19 21:01:22.607012573 +0000 UTC m=+3360.212965675" watchObservedRunningTime="2026-03-19 21:01:22.616065448 +0000 UTC m=+3360.222018620" Mar 19 21:01:24 crc kubenswrapper[4799]: I0319 21:01:24.115934 4799 scope.go:117] "RemoveContainer" 
containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:01:24 crc kubenswrapper[4799]: E0319 21:01:24.116670 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.388096 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9466g/crc-debug-qs7sk"] Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.389716 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.391516 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9466g"/"default-dockercfg-brthb" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.544927 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnns2\" (UniqueName: \"kubernetes.io/projected/329fe091-50a0-412b-b372-72ee0b761360-kube-api-access-lnns2\") pod \"crc-debug-qs7sk\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.545020 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/329fe091-50a0-412b-b372-72ee0b761360-host\") pod \"crc-debug-qs7sk\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.647223 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/329fe091-50a0-412b-b372-72ee0b761360-host\") pod \"crc-debug-qs7sk\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.647405 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/329fe091-50a0-412b-b372-72ee0b761360-host\") pod \"crc-debug-qs7sk\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.647434 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnns2\" (UniqueName: \"kubernetes.io/projected/329fe091-50a0-412b-b372-72ee0b761360-kube-api-access-lnns2\") pod \"crc-debug-qs7sk\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.680970 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnns2\" (UniqueName: \"kubernetes.io/projected/329fe091-50a0-412b-b372-72ee0b761360-kube-api-access-lnns2\") pod \"crc-debug-qs7sk\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: I0319 21:01:25.708081 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:01:25 crc kubenswrapper[4799]: W0319 21:01:25.747139 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329fe091_50a0_412b_b372_72ee0b761360.slice/crio-4c785a719be28c22fc03e2b018248f3c184980929cbaff22bee3112087121dbe WatchSource:0}: Error finding container 4c785a719be28c22fc03e2b018248f3c184980929cbaff22bee3112087121dbe: Status 404 returned error can't find the container with id 4c785a719be28c22fc03e2b018248f3c184980929cbaff22bee3112087121dbe Mar 19 21:01:26 crc kubenswrapper[4799]: I0319 21:01:26.620625 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-qs7sk" event={"ID":"329fe091-50a0-412b-b372-72ee0b761360","Type":"ContainerStarted","Data":"4c785a719be28c22fc03e2b018248f3c184980929cbaff22bee3112087121dbe"} Mar 19 21:01:37 crc kubenswrapper[4799]: I0319 21:01:37.116150 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:01:37 crc kubenswrapper[4799]: E0319 21:01:37.116913 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:01:37 crc kubenswrapper[4799]: I0319 21:01:37.715664 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-qs7sk" event={"ID":"329fe091-50a0-412b-b372-72ee0b761360","Type":"ContainerStarted","Data":"3b20e0a7a7a8f6d54a23c6e836c4afc99b9bec34673fb2941fdb159bed185d00"} Mar 19 21:01:37 crc kubenswrapper[4799]: I0319 21:01:37.737410 4799 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9466g/crc-debug-qs7sk" podStartSLOduration=1.8120476920000002 podStartE2EDuration="12.737392767s" podCreationTimestamp="2026-03-19 21:01:25 +0000 UTC" firstStartedPulling="2026-03-19 21:01:25.748940941 +0000 UTC m=+3363.354894013" lastFinishedPulling="2026-03-19 21:01:36.674286016 +0000 UTC m=+3374.280239088" observedRunningTime="2026-03-19 21:01:37.736175463 +0000 UTC m=+3375.342128565" watchObservedRunningTime="2026-03-19 21:01:37.737392767 +0000 UTC m=+3375.343345839" Mar 19 21:01:37 crc kubenswrapper[4799]: I0319 21:01:37.998354 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pwlkz"] Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.000651 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.011080 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwlkz"] Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.092251 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6q9h\" (UniqueName: \"kubernetes.io/projected/9e69acdf-6e20-410a-9111-627dcf26baeb-kube-api-access-m6q9h\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.092355 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-catalog-content\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.092400 4799 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-utilities\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.194956 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-catalog-content\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.195023 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-utilities\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.195175 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6q9h\" (UniqueName: \"kubernetes.io/projected/9e69acdf-6e20-410a-9111-627dcf26baeb-kube-api-access-m6q9h\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.195934 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-catalog-content\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.196042 4799 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-utilities\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.218713 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6q9h\" (UniqueName: \"kubernetes.io/projected/9e69acdf-6e20-410a-9111-627dcf26baeb-kube-api-access-m6q9h\") pod \"redhat-operators-pwlkz\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.322226 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:38 crc kubenswrapper[4799]: I0319 21:01:38.929804 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwlkz"] Mar 19 21:01:39 crc kubenswrapper[4799]: I0319 21:01:39.765837 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerID="15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b" exitCode=0 Mar 19 21:01:39 crc kubenswrapper[4799]: I0319 21:01:39.766096 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerDied","Data":"15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b"} Mar 19 21:01:39 crc kubenswrapper[4799]: I0319 21:01:39.766140 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerStarted","Data":"a4b3e4d616cc410bb029195fc1eaf974ee556834a05169761cc8723838bcf8e8"} Mar 19 21:01:41 crc kubenswrapper[4799]: I0319 21:01:41.783747 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerStarted","Data":"abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d"} Mar 19 21:01:44 crc kubenswrapper[4799]: I0319 21:01:44.828514 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerID="abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d" exitCode=0 Mar 19 21:01:44 crc kubenswrapper[4799]: I0319 21:01:44.828584 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerDied","Data":"abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d"} Mar 19 21:01:45 crc kubenswrapper[4799]: I0319 21:01:45.851027 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerStarted","Data":"30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c"} Mar 19 21:01:45 crc kubenswrapper[4799]: I0319 21:01:45.872475 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pwlkz" podStartSLOduration=3.35810933 podStartE2EDuration="8.872458166s" podCreationTimestamp="2026-03-19 21:01:37 +0000 UTC" firstStartedPulling="2026-03-19 21:01:39.768162902 +0000 UTC m=+3377.374115974" lastFinishedPulling="2026-03-19 21:01:45.282511738 +0000 UTC m=+3382.888464810" observedRunningTime="2026-03-19 21:01:45.870008517 +0000 UTC m=+3383.475961589" watchObservedRunningTime="2026-03-19 21:01:45.872458166 +0000 UTC m=+3383.478411238" Mar 19 21:01:48 crc kubenswrapper[4799]: I0319 21:01:48.322562 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:48 crc 
kubenswrapper[4799]: I0319 21:01:48.322898 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:49 crc kubenswrapper[4799]: I0319 21:01:49.374633 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pwlkz" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="registry-server" probeResult="failure" output=< Mar 19 21:01:49 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 21:01:49 crc kubenswrapper[4799]: > Mar 19 21:01:51 crc kubenswrapper[4799]: I0319 21:01:51.116329 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:01:51 crc kubenswrapper[4799]: E0319 21:01:51.117124 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:01:58 crc kubenswrapper[4799]: I0319 21:01:58.386669 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:58 crc kubenswrapper[4799]: I0319 21:01:58.446930 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:01:58 crc kubenswrapper[4799]: I0319 21:01:58.633316 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwlkz"] Mar 19 21:01:59 crc kubenswrapper[4799]: I0319 21:01:59.989902 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pwlkz" 
podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="registry-server" containerID="cri-o://30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c" gracePeriod=2 Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.150471 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565902-qb8wl"] Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.151888 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.153968 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.154128 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.154312 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.173626 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565902-qb8wl"] Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.211895 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g2g8\" (UniqueName: \"kubernetes.io/projected/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7-kube-api-access-6g2g8\") pod \"auto-csr-approver-29565902-qb8wl\" (UID: \"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7\") " pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.312772 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g2g8\" (UniqueName: \"kubernetes.io/projected/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7-kube-api-access-6g2g8\") pod \"auto-csr-approver-29565902-qb8wl\" 
(UID: \"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7\") " pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.351757 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g2g8\" (UniqueName: \"kubernetes.io/projected/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7-kube-api-access-6g2g8\") pod \"auto-csr-approver-29565902-qb8wl\" (UID: \"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7\") " pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.517500 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.606988 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.718748 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-utilities\") pod \"9e69acdf-6e20-410a-9111-627dcf26baeb\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.719154 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6q9h\" (UniqueName: \"kubernetes.io/projected/9e69acdf-6e20-410a-9111-627dcf26baeb-kube-api-access-m6q9h\") pod \"9e69acdf-6e20-410a-9111-627dcf26baeb\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.719176 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-catalog-content\") pod \"9e69acdf-6e20-410a-9111-627dcf26baeb\" (UID: \"9e69acdf-6e20-410a-9111-627dcf26baeb\") " Mar 19 21:02:00 crc 
kubenswrapper[4799]: I0319 21:02:00.720277 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-utilities" (OuterVolumeSpecName: "utilities") pod "9e69acdf-6e20-410a-9111-627dcf26baeb" (UID: "9e69acdf-6e20-410a-9111-627dcf26baeb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.724615 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e69acdf-6e20-410a-9111-627dcf26baeb-kube-api-access-m6q9h" (OuterVolumeSpecName: "kube-api-access-m6q9h") pod "9e69acdf-6e20-410a-9111-627dcf26baeb" (UID: "9e69acdf-6e20-410a-9111-627dcf26baeb"). InnerVolumeSpecName "kube-api-access-m6q9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.827429 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.827465 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6q9h\" (UniqueName: \"kubernetes.io/projected/9e69acdf-6e20-410a-9111-627dcf26baeb-kube-api-access-m6q9h\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.862303 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e69acdf-6e20-410a-9111-627dcf26baeb" (UID: "9e69acdf-6e20-410a-9111-627dcf26baeb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.929367 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e69acdf-6e20-410a-9111-627dcf26baeb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.998195 4799 generic.go:334] "Generic (PLEG): container finished" podID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerID="30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c" exitCode=0 Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.998249 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerDied","Data":"30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c"} Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.998283 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwlkz" event={"ID":"9e69acdf-6e20-410a-9111-627dcf26baeb","Type":"ContainerDied","Data":"a4b3e4d616cc410bb029195fc1eaf974ee556834a05169761cc8723838bcf8e8"} Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.998304 4799 scope.go:117] "RemoveContainer" containerID="30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c" Mar 19 21:02:00 crc kubenswrapper[4799]: I0319 21:02:00.998488 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pwlkz" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.032447 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565902-qb8wl"] Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.033691 4799 scope.go:117] "RemoveContainer" containerID="abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.053793 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwlkz"] Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.074089 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pwlkz"] Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.076993 4799 scope.go:117] "RemoveContainer" containerID="15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.149486 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" path="/var/lib/kubelet/pods/9e69acdf-6e20-410a-9111-627dcf26baeb/volumes" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.176132 4799 scope.go:117] "RemoveContainer" containerID="30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c" Mar 19 21:02:01 crc kubenswrapper[4799]: E0319 21:02:01.176764 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c\": container with ID starting with 30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c not found: ID does not exist" containerID="30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.176847 4799 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c"} err="failed to get container status \"30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c\": rpc error: code = NotFound desc = could not find container \"30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c\": container with ID starting with 30ff23ccdb5b7e96b5fedf58323fd63c998237e16f84c8a9717fccc0a68d3a4c not found: ID does not exist" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.176872 4799 scope.go:117] "RemoveContainer" containerID="abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d" Mar 19 21:02:01 crc kubenswrapper[4799]: E0319 21:02:01.177193 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d\": container with ID starting with abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d not found: ID does not exist" containerID="abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.177218 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d"} err="failed to get container status \"abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d\": rpc error: code = NotFound desc = could not find container \"abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d\": container with ID starting with abe94bbc55aa55e3734d3b9141cd4ce22888f28b7b10e3106e05a0c248780d1d not found: ID does not exist" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.177234 4799 scope.go:117] "RemoveContainer" containerID="15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b" Mar 19 21:02:01 crc kubenswrapper[4799]: E0319 21:02:01.177574 4799 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b\": container with ID starting with 15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b not found: ID does not exist" containerID="15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b" Mar 19 21:02:01 crc kubenswrapper[4799]: I0319 21:02:01.177590 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b"} err="failed to get container status \"15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b\": rpc error: code = NotFound desc = could not find container \"15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b\": container with ID starting with 15e45ea2d688db64f698490c0d09481c0121c177f58da828d440288d354d7d4b not found: ID does not exist" Mar 19 21:02:02 crc kubenswrapper[4799]: I0319 21:02:02.008012 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" event={"ID":"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7","Type":"ContainerStarted","Data":"15233aa64203b27a2a58dae37cd6432b5153a1995ebf8667a7bd0e9bfb71c7fd"} Mar 19 21:02:03 crc kubenswrapper[4799]: I0319 21:02:03.017697 4799 generic.go:334] "Generic (PLEG): container finished" podID="13d5f387-63eb-4dce-b4c3-3ec1a012e0a7" containerID="4765a86c083715c0157d1fd86ab50831f58c139d05a62e294e09ba37ef00d353" exitCode=0 Mar 19 21:02:03 crc kubenswrapper[4799]: I0319 21:02:03.017748 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" event={"ID":"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7","Type":"ContainerDied","Data":"4765a86c083715c0157d1fd86ab50831f58c139d05a62e294e09ba37ef00d353"} Mar 19 21:02:03 crc kubenswrapper[4799]: I0319 21:02:03.128808 4799 scope.go:117] "RemoveContainer" 
containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:02:03 crc kubenswrapper[4799]: E0319 21:02:03.129151 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:02:04 crc kubenswrapper[4799]: I0319 21:02:04.421356 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:04 crc kubenswrapper[4799]: I0319 21:02:04.508487 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g2g8\" (UniqueName: \"kubernetes.io/projected/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7-kube-api-access-6g2g8\") pod \"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7\" (UID: \"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7\") " Mar 19 21:02:04 crc kubenswrapper[4799]: I0319 21:02:04.514623 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7-kube-api-access-6g2g8" (OuterVolumeSpecName: "kube-api-access-6g2g8") pod "13d5f387-63eb-4dce-b4c3-3ec1a012e0a7" (UID: "13d5f387-63eb-4dce-b4c3-3ec1a012e0a7"). InnerVolumeSpecName "kube-api-access-6g2g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:02:04 crc kubenswrapper[4799]: I0319 21:02:04.610858 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g2g8\" (UniqueName: \"kubernetes.io/projected/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7-kube-api-access-6g2g8\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:05 crc kubenswrapper[4799]: I0319 21:02:05.039541 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" event={"ID":"13d5f387-63eb-4dce-b4c3-3ec1a012e0a7","Type":"ContainerDied","Data":"15233aa64203b27a2a58dae37cd6432b5153a1995ebf8667a7bd0e9bfb71c7fd"} Mar 19 21:02:05 crc kubenswrapper[4799]: I0319 21:02:05.039576 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15233aa64203b27a2a58dae37cd6432b5153a1995ebf8667a7bd0e9bfb71c7fd" Mar 19 21:02:05 crc kubenswrapper[4799]: I0319 21:02:05.039643 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565902-qb8wl" Mar 19 21:02:05 crc kubenswrapper[4799]: I0319 21:02:05.565521 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565896-pzvm6"] Mar 19 21:02:05 crc kubenswrapper[4799]: I0319 21:02:05.602179 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565896-pzvm6"] Mar 19 21:02:07 crc kubenswrapper[4799]: I0319 21:02:07.125370 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43eb2d4-4e9e-4d8c-a149-731e1a0431dd" path="/var/lib/kubelet/pods/e43eb2d4-4e9e-4d8c-a149-731e1a0431dd/volumes" Mar 19 21:02:14 crc kubenswrapper[4799]: I0319 21:02:14.117012 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:02:14 crc kubenswrapper[4799]: E0319 21:02:14.118497 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:02:16 crc kubenswrapper[4799]: I0319 21:02:16.149246 4799 generic.go:334] "Generic (PLEG): container finished" podID="329fe091-50a0-412b-b372-72ee0b761360" containerID="3b20e0a7a7a8f6d54a23c6e836c4afc99b9bec34673fb2941fdb159bed185d00" exitCode=0 Mar 19 21:02:16 crc kubenswrapper[4799]: I0319 21:02:16.149306 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-qs7sk" event={"ID":"329fe091-50a0-412b-b372-72ee0b761360","Type":"ContainerDied","Data":"3b20e0a7a7a8f6d54a23c6e836c4afc99b9bec34673fb2941fdb159bed185d00"} Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.275531 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.308588 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9466g/crc-debug-qs7sk"] Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.318490 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9466g/crc-debug-qs7sk"] Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.401164 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/329fe091-50a0-412b-b372-72ee0b761360-host\") pod \"329fe091-50a0-412b-b372-72ee0b761360\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.401203 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnns2\" (UniqueName: \"kubernetes.io/projected/329fe091-50a0-412b-b372-72ee0b761360-kube-api-access-lnns2\") pod \"329fe091-50a0-412b-b372-72ee0b761360\" (UID: \"329fe091-50a0-412b-b372-72ee0b761360\") " Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.402784 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/329fe091-50a0-412b-b372-72ee0b761360-host" (OuterVolumeSpecName: "host") pod "329fe091-50a0-412b-b372-72ee0b761360" (UID: "329fe091-50a0-412b-b372-72ee0b761360"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.409534 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329fe091-50a0-412b-b372-72ee0b761360-kube-api-access-lnns2" (OuterVolumeSpecName: "kube-api-access-lnns2") pod "329fe091-50a0-412b-b372-72ee0b761360" (UID: "329fe091-50a0-412b-b372-72ee0b761360"). InnerVolumeSpecName "kube-api-access-lnns2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.502783 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnns2\" (UniqueName: \"kubernetes.io/projected/329fe091-50a0-412b-b372-72ee0b761360-kube-api-access-lnns2\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:17 crc kubenswrapper[4799]: I0319 21:02:17.502813 4799 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/329fe091-50a0-412b-b372-72ee0b761360-host\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.168330 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c785a719be28c22fc03e2b018248f3c184980929cbaff22bee3112087121dbe" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.168406 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-qs7sk" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.555835 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9466g/crc-debug-tnslc"] Mar 19 21:02:18 crc kubenswrapper[4799]: E0319 21:02:18.556836 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="registry-server" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.556860 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="registry-server" Mar 19 21:02:18 crc kubenswrapper[4799]: E0319 21:02:18.556873 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="extract-content" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.556884 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="extract-content" Mar 19 21:02:18 crc kubenswrapper[4799]: 
E0319 21:02:18.556915 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d5f387-63eb-4dce-b4c3-3ec1a012e0a7" containerName="oc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.556928 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d5f387-63eb-4dce-b4c3-3ec1a012e0a7" containerName="oc" Mar 19 21:02:18 crc kubenswrapper[4799]: E0319 21:02:18.556953 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="extract-utilities" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.556965 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="extract-utilities" Mar 19 21:02:18 crc kubenswrapper[4799]: E0319 21:02:18.556988 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329fe091-50a0-412b-b372-72ee0b761360" containerName="container-00" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.556997 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="329fe091-50a0-412b-b372-72ee0b761360" containerName="container-00" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.557316 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="329fe091-50a0-412b-b372-72ee0b761360" containerName="container-00" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.557349 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d5f387-63eb-4dce-b4c3-3ec1a012e0a7" containerName="oc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.557371 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e69acdf-6e20-410a-9111-627dcf26baeb" containerName="registry-server" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.558290 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.560851 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9466g"/"default-dockercfg-brthb" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.729703 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6hh\" (UniqueName: \"kubernetes.io/projected/106992e3-7813-4e73-b946-43be8a1eed8e-kube-api-access-vg6hh\") pod \"crc-debug-tnslc\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.730188 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/106992e3-7813-4e73-b946-43be8a1eed8e-host\") pod \"crc-debug-tnslc\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.832337 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/106992e3-7813-4e73-b946-43be8a1eed8e-host\") pod \"crc-debug-tnslc\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.832470 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6hh\" (UniqueName: \"kubernetes.io/projected/106992e3-7813-4e73-b946-43be8a1eed8e-kube-api-access-vg6hh\") pod \"crc-debug-tnslc\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.832888 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/106992e3-7813-4e73-b946-43be8a1eed8e-host\") pod \"crc-debug-tnslc\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.862638 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6hh\" (UniqueName: \"kubernetes.io/projected/106992e3-7813-4e73-b946-43be8a1eed8e-kube-api-access-vg6hh\") pod \"crc-debug-tnslc\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:18 crc kubenswrapper[4799]: I0319 21:02:18.886864 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:19 crc kubenswrapper[4799]: I0319 21:02:19.128090 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329fe091-50a0-412b-b372-72ee0b761360" path="/var/lib/kubelet/pods/329fe091-50a0-412b-b372-72ee0b761360/volumes" Mar 19 21:02:19 crc kubenswrapper[4799]: I0319 21:02:19.177861 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-tnslc" event={"ID":"106992e3-7813-4e73-b946-43be8a1eed8e","Type":"ContainerStarted","Data":"7d6584362e2a0c54e63cc1c2f4237f56dadcdad7fec794dbc4163db6b3050511"} Mar 19 21:02:19 crc kubenswrapper[4799]: I0319 21:02:19.177903 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-tnslc" event={"ID":"106992e3-7813-4e73-b946-43be8a1eed8e","Type":"ContainerStarted","Data":"7756bf2cae35abc812cb129dba0845a8876ef6075147026220b87f67c62fb2a5"} Mar 19 21:02:19 crc kubenswrapper[4799]: I0319 21:02:19.202258 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9466g/crc-debug-tnslc" podStartSLOduration=1.202232421 podStartE2EDuration="1.202232421s" podCreationTimestamp="2026-03-19 21:02:18 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 21:02:19.191688314 +0000 UTC m=+3416.797641386" watchObservedRunningTime="2026-03-19 21:02:19.202232421 +0000 UTC m=+3416.808185523" Mar 19 21:02:20 crc kubenswrapper[4799]: I0319 21:02:20.192446 4799 generic.go:334] "Generic (PLEG): container finished" podID="106992e3-7813-4e73-b946-43be8a1eed8e" containerID="7d6584362e2a0c54e63cc1c2f4237f56dadcdad7fec794dbc4163db6b3050511" exitCode=0 Mar 19 21:02:20 crc kubenswrapper[4799]: I0319 21:02:20.192887 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-tnslc" event={"ID":"106992e3-7813-4e73-b946-43be8a1eed8e","Type":"ContainerDied","Data":"7d6584362e2a0c54e63cc1c2f4237f56dadcdad7fec794dbc4163db6b3050511"} Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.302159 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.338833 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9466g/crc-debug-tnslc"] Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.346638 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9466g/crc-debug-tnslc"] Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.481272 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg6hh\" (UniqueName: \"kubernetes.io/projected/106992e3-7813-4e73-b946-43be8a1eed8e-kube-api-access-vg6hh\") pod \"106992e3-7813-4e73-b946-43be8a1eed8e\" (UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.481343 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/106992e3-7813-4e73-b946-43be8a1eed8e-host\") pod \"106992e3-7813-4e73-b946-43be8a1eed8e\" 
(UID: \"106992e3-7813-4e73-b946-43be8a1eed8e\") " Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.481973 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/106992e3-7813-4e73-b946-43be8a1eed8e-host" (OuterVolumeSpecName: "host") pod "106992e3-7813-4e73-b946-43be8a1eed8e" (UID: "106992e3-7813-4e73-b946-43be8a1eed8e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.488058 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106992e3-7813-4e73-b946-43be8a1eed8e-kube-api-access-vg6hh" (OuterVolumeSpecName: "kube-api-access-vg6hh") pod "106992e3-7813-4e73-b946-43be8a1eed8e" (UID: "106992e3-7813-4e73-b946-43be8a1eed8e"). InnerVolumeSpecName "kube-api-access-vg6hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.583871 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg6hh\" (UniqueName: \"kubernetes.io/projected/106992e3-7813-4e73-b946-43be8a1eed8e-kube-api-access-vg6hh\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.583904 4799 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/106992e3-7813-4e73-b946-43be8a1eed8e-host\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:21 crc kubenswrapper[4799]: I0319 21:02:21.875365 4799 scope.go:117] "RemoveContainer" containerID="b84c2eaa9ebbde8cae13e91dc50a7741979ccb7a9ceff1a910b03101170653c6" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.213323 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7756bf2cae35abc812cb129dba0845a8876ef6075147026220b87f67c62fb2a5" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.213442 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-tnslc" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.539978 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9466g/crc-debug-csczp"] Mar 19 21:02:22 crc kubenswrapper[4799]: E0319 21:02:22.540342 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106992e3-7813-4e73-b946-43be8a1eed8e" containerName="container-00" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.540353 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="106992e3-7813-4e73-b946-43be8a1eed8e" containerName="container-00" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.540549 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="106992e3-7813-4e73-b946-43be8a1eed8e" containerName="container-00" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.541150 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.542748 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9466g"/"default-dockercfg-brthb" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.705034 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64efd86f-b043-4b0e-af5b-1f455bbaef15-host\") pod \"crc-debug-csczp\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.705227 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgsp\" (UniqueName: \"kubernetes.io/projected/64efd86f-b043-4b0e-af5b-1f455bbaef15-kube-api-access-fzgsp\") pod \"crc-debug-csczp\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " 
pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.807849 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64efd86f-b043-4b0e-af5b-1f455bbaef15-host\") pod \"crc-debug-csczp\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.808030 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgsp\" (UniqueName: \"kubernetes.io/projected/64efd86f-b043-4b0e-af5b-1f455bbaef15-kube-api-access-fzgsp\") pod \"crc-debug-csczp\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.808185 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64efd86f-b043-4b0e-af5b-1f455bbaef15-host\") pod \"crc-debug-csczp\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.839883 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgsp\" (UniqueName: \"kubernetes.io/projected/64efd86f-b043-4b0e-af5b-1f455bbaef15-kube-api-access-fzgsp\") pod \"crc-debug-csczp\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: I0319 21:02:22.872480 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:22 crc kubenswrapper[4799]: W0319 21:02:22.905583 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-623ca6d774c6d47d8eee67b81d5492e3327af8e7fbb1eb622180a3eaa2990dda WatchSource:0}: Error finding container 623ca6d774c6d47d8eee67b81d5492e3327af8e7fbb1eb622180a3eaa2990dda: Status 404 returned error can't find the container with id 623ca6d774c6d47d8eee67b81d5492e3327af8e7fbb1eb622180a3eaa2990dda Mar 19 21:02:23 crc kubenswrapper[4799]: I0319 21:02:23.154734 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106992e3-7813-4e73-b946-43be8a1eed8e" path="/var/lib/kubelet/pods/106992e3-7813-4e73-b946-43be8a1eed8e/volumes" Mar 19 21:02:23 crc kubenswrapper[4799]: I0319 21:02:23.226062 4799 generic.go:334] "Generic (PLEG): container finished" podID="64efd86f-b043-4b0e-af5b-1f455bbaef15" containerID="e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b" exitCode=0 Mar 19 21:02:23 crc kubenswrapper[4799]: I0319 21:02:23.226112 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-csczp" event={"ID":"64efd86f-b043-4b0e-af5b-1f455bbaef15","Type":"ContainerDied","Data":"e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b"} Mar 19 21:02:23 crc kubenswrapper[4799]: I0319 21:02:23.226151 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/crc-debug-csczp" event={"ID":"64efd86f-b043-4b0e-af5b-1f455bbaef15","Type":"ContainerStarted","Data":"623ca6d774c6d47d8eee67b81d5492e3327af8e7fbb1eb622180a3eaa2990dda"} Mar 19 21:02:23 crc kubenswrapper[4799]: I0319 21:02:23.261894 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9466g/crc-debug-csczp"] Mar 19 21:02:23 crc kubenswrapper[4799]: I0319 21:02:23.270578 4799 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9466g/crc-debug-csczp"] Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.363562 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.541871 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64efd86f-b043-4b0e-af5b-1f455bbaef15-host\") pod \"64efd86f-b043-4b0e-af5b-1f455bbaef15\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.541980 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzgsp\" (UniqueName: \"kubernetes.io/projected/64efd86f-b043-4b0e-af5b-1f455bbaef15-kube-api-access-fzgsp\") pod \"64efd86f-b043-4b0e-af5b-1f455bbaef15\" (UID: \"64efd86f-b043-4b0e-af5b-1f455bbaef15\") " Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.542068 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64efd86f-b043-4b0e-af5b-1f455bbaef15-host" (OuterVolumeSpecName: "host") pod "64efd86f-b043-4b0e-af5b-1f455bbaef15" (UID: "64efd86f-b043-4b0e-af5b-1f455bbaef15"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.542684 4799 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64efd86f-b043-4b0e-af5b-1f455bbaef15-host\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.549997 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64efd86f-b043-4b0e-af5b-1f455bbaef15-kube-api-access-fzgsp" (OuterVolumeSpecName: "kube-api-access-fzgsp") pod "64efd86f-b043-4b0e-af5b-1f455bbaef15" (UID: "64efd86f-b043-4b0e-af5b-1f455bbaef15"). InnerVolumeSpecName "kube-api-access-fzgsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:02:24 crc kubenswrapper[4799]: I0319 21:02:24.644965 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzgsp\" (UniqueName: \"kubernetes.io/projected/64efd86f-b043-4b0e-af5b-1f455bbaef15-kube-api-access-fzgsp\") on node \"crc\" DevicePath \"\"" Mar 19 21:02:25 crc kubenswrapper[4799]: I0319 21:02:25.133483 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64efd86f-b043-4b0e-af5b-1f455bbaef15" path="/var/lib/kubelet/pods/64efd86f-b043-4b0e-af5b-1f455bbaef15/volumes" Mar 19 21:02:25 crc kubenswrapper[4799]: I0319 21:02:25.248229 4799 scope.go:117] "RemoveContainer" containerID="e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b" Mar 19 21:02:25 crc kubenswrapper[4799]: I0319 21:02:25.248318 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/crc-debug-csczp" Mar 19 21:02:29 crc kubenswrapper[4799]: I0319 21:02:29.116568 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:02:29 crc kubenswrapper[4799]: E0319 21:02:29.117155 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:02:31 crc kubenswrapper[4799]: E0319 21:02:31.163146 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-conmon-e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 21:02:40 crc kubenswrapper[4799]: I0319 21:02:40.907246 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-868b778b64-pzgfd_c6e6e053-3361-4d22-9ef7-fd7e96b77cf4/barbican-api/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.127480 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-868b778b64-pzgfd_c6e6e053-3361-4d22-9ef7-fd7e96b77cf4/barbican-api-log/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.140595 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-776dcdd75d-ljvsd_4b710194-7925-42ab-b779-be3f32094307/barbican-keystone-listener/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.161116 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-776dcdd75d-ljvsd_4b710194-7925-42ab-b779-be3f32094307/barbican-keystone-listener-log/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.296153 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-784d54cf6f-wm6ts_c6596c03-a397-4f22-b511-86e89635a92a/barbican-worker/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.352280 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-784d54cf6f-wm6ts_c6596c03-a397-4f22-b511-86e89635a92a/barbican-worker-log/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: E0319 21:02:41.378083 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-conmon-e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.508952 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9_8befdbab-e306-4827-98a7-f042c02380ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.575555 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/ceilometer-central-agent/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.653255 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/ceilometer-notification-agent/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.686710 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/proxy-httpd/0.log" Mar 19 21:02:41 crc 
kubenswrapper[4799]: I0319 21:02:41.746637 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/sg-core/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.851960 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_01d335a7-09e5-4073-bf70-5ac03807ff12/cinder-api/0.log" Mar 19 21:02:41 crc kubenswrapper[4799]: I0319 21:02:41.882515 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_01d335a7-09e5-4073-bf70-5ac03807ff12/cinder-api-log/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.004959 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6c780df-41c8-47d8-af6d-3dcefb770b8d/cinder-scheduler/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.059737 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6c780df-41c8-47d8-af6d-3dcefb770b8d/probe/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.154875 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw_8886a9f8-8f15-43f4-a721-3a487d2ff6f7/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.243580 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh_e06b9a2d-57e1-4c4d-a08e-d0c2c343130d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.362430 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79db78f56f-q2stc_9ae6606b-efd1-4bf8-a13f-c14d96bbaa99/init/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.573701 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-79db78f56f-q2stc_9ae6606b-efd1-4bf8-a13f-c14d96bbaa99/dnsmasq-dns/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.591071 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vgxll_8e141b61-4ffc-407f-a554-58f8176b1b18/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.597931 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79db78f56f-q2stc_9ae6606b-efd1-4bf8-a13f-c14d96bbaa99/init/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.759214 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ec6464f3-4c36-4387-8127-56a300a1d79c/glance-httpd/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.775962 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ec6464f3-4c36-4387-8127-56a300a1d79c/glance-log/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.949864 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a4455494-ef4e-4f95-87d6-cd495059bb9a/glance-log/0.log" Mar 19 21:02:42 crc kubenswrapper[4799]: I0319 21:02:42.988045 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a4455494-ef4e-4f95-87d6-cd495059bb9a/glance-httpd/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.090353 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56454c8868-kxl79_d9b7bec9-2633-410d-be4e-c65c9a903a38/horizon/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.296371 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nc852_9e20098d-7fb1-4aba-b460-1440751e6dc2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.387555 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56454c8868-kxl79_d9b7bec9-2633-410d-be4e-c65c9a903a38/horizon-log/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.435970 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4wbn8_416c30d9-5442-418f-a668-fcca8c4804a2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.642865 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565901-g94sr_88b1db62-9d2d-4d6b-8a37-d45f1feb2319/keystone-cron/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.662395 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76f77cb758-wwjbp_0eea0f4a-eab7-4dae-844b-f614654cd6d4/keystone-api/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.818861 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_50808de7-9788-451a-8910-6be8f217ae09/kube-state-metrics/0.log" Mar 19 21:02:43 crc kubenswrapper[4799]: I0319 21:02:43.891172 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6_880fa93c-771b-4896-9ab3-cb7d0a9ab3e0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:44 crc kubenswrapper[4799]: I0319 21:02:44.116516 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:02:44 crc kubenswrapper[4799]: E0319 21:02:44.116798 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:02:44 crc kubenswrapper[4799]: I0319 21:02:44.224102 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77768f5c85-6lgxw_f2d71d6b-73c6-4edf-8ef2-5295c628603c/neutron-api/0.log" Mar 19 21:02:44 crc kubenswrapper[4799]: I0319 21:02:44.310896 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77768f5c85-6lgxw_f2d71d6b-73c6-4edf-8ef2-5295c628603c/neutron-httpd/0.log" Mar 19 21:02:44 crc kubenswrapper[4799]: I0319 21:02:44.423004 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6_28b384cd-034d-407f-a7f6-0b1b0ddffb4f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:44 crc kubenswrapper[4799]: I0319 21:02:44.877520 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d8aa3e34-ca13-45cd-960f-2973a80c80b8/nova-cell0-conductor-conductor/0.log" Mar 19 21:02:44 crc kubenswrapper[4799]: I0319 21:02:44.890668 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f822edd3-bebb-4a8a-9755-60419527dbde/nova-api-log/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.234911 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_df1146cf-6649-441e-b17f-bfbebbdc3439/nova-cell1-conductor-conductor/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.246836 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e45143c6-a88e-40c6-a6fc-5452da3be735/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 
21:02:45.291897 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f822edd3-bebb-4a8a-9755-60419527dbde/nova-api-api/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.474630 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5x7fx_0317cfee-27aa-4ba1-9c6a-cf2b368c811b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.577002 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a1b0148b-b48b-44b2-9fee-1fd4389fbf77/nova-metadata-log/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.867639 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_2add32a8-30b6-4000-bdd7-c96cda2bb599/nova-scheduler-scheduler/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.895748 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ad66907-e766-4e25-9e0c-03e2a0a803e6/mysql-bootstrap/0.log" Mar 19 21:02:45 crc kubenswrapper[4799]: I0319 21:02:45.927355 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a1b0148b-b48b-44b2-9fee-1fd4389fbf77/nova-metadata-metadata/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.112290 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ad66907-e766-4e25-9e0c-03e2a0a803e6/mysql-bootstrap/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.137099 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ad66907-e766-4e25-9e0c-03e2a0a803e6/galera/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.145965 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5b6d87e-0486-4c0c-9578-514626ca7579/mysql-bootstrap/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: 
I0319 21:02:46.388302 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5b6d87e-0486-4c0c-9578-514626ca7579/galera/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.395682 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_575c8839-3cdb-4137-967a-3544c626113f/openstackclient/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.395783 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5b6d87e-0486-4c0c-9578-514626ca7579/mysql-bootstrap/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.548833 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-h89k2_cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8/ovn-controller/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.637935 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-q7s8k_ada5da2f-892a-4d4b-a18c-d641456e9124/openstack-network-exporter/0.log" Mar 19 21:02:46 crc kubenswrapper[4799]: I0319 21:02:46.848991 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovsdb-server-init/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.012701 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovsdb-server-init/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.069814 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovsdb-server/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.132329 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovs-vswitchd/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.211562 
4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c9nbt_b526ddd9-685d-42a9-8598-d5dd7710942a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.302598 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_df45d4ef-7350-42d7-a2d7-cede9b13ff55/openstack-network-exporter/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.361145 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_df45d4ef-7350-42d7-a2d7-cede9b13ff55/ovn-northd/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.507089 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a3d93760-46c9-46e0-aff9-38ad08bad16b/openstack-network-exporter/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.523186 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a3d93760-46c9-46e0-aff9-38ad08bad16b/ovsdbserver-nb/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.647222 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_379ddd3c-6b22-4ccd-90a0-8c1cce4a572b/openstack-network-exporter/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.704500 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_379ddd3c-6b22-4ccd-90a0-8c1cce4a572b/ovsdbserver-sb/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.834167 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-686489978d-5lwnf_55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3/placement-api/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 21:02:47.907186 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-686489978d-5lwnf_55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3/placement-log/0.log" Mar 19 21:02:47 crc kubenswrapper[4799]: I0319 
21:02:47.952803 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1b1c5d7a-7501-4c34-9823-c996a2413399/setup-container/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.157245 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3117828b-97c2-41b6-a48d-cf7154e2bb71/setup-container/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.181294 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1b1c5d7a-7501-4c34-9823-c996a2413399/setup-container/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.207905 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1b1c5d7a-7501-4c34-9823-c996a2413399/rabbitmq/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.444283 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3117828b-97c2-41b6-a48d-cf7154e2bb71/rabbitmq/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.453572 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh_d20cbc69-15fd-45a3-95f9-d29078eb55c7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.463585 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3117828b-97c2-41b6-a48d-cf7154e2bb71/setup-container/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.676883 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ngr7f_27e3e050-d2b1-4bc2-b93c-4258f8f4a86d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.806234 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck_6e6ab146-318d-4a9d-860b-8c1d5c12fab9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:48 crc kubenswrapper[4799]: I0319 21:02:48.943126 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-cxss6_2c78b6c2-a079-427d-9ebe-f8250777e6bd/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.099284 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2jrjr_c2e0b37a-955c-4332-bae2-6a7ffd2712f4/ssh-known-hosts-edpm-deployment/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.312006 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75d564c56c-vzn6z_ed03c750-cae7-4181-8451-c88d57969c01/proxy-server/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.371211 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75d564c56c-vzn6z_ed03c750-cae7-4181-8451-c88d57969c01/proxy-httpd/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.414867 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-stws7_1f8bd39c-7709-4713-b7f6-9713873dae5b/swift-ring-rebalance/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.555245 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-auditor/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.621896 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-reaper/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.866441 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-server/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.888065 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-replicator/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.914935 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-auditor/0.log" Mar 19 21:02:49 crc kubenswrapper[4799]: I0319 21:02:49.969377 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-replicator/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.026128 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-server/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.094247 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-updater/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.143342 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-auditor/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.182773 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-expirer/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.294500 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-replicator/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.296418 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-server/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.354711 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-updater/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.478782 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/rsync/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.545702 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/swift-recon-cron/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.623548 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-j27sk_d77bbb41-e2be-4c81-b266-92ca0b6e8b44/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.747186 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66572ad6-a9d4-4dc7-ae3e-61a1d67d928a/tempest-tests-tempest-tests-runner/0.log" Mar 19 21:02:50 crc kubenswrapper[4799]: I0319 21:02:50.828399 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e3586638-0807-4ea9-9027-0b953c5ea3cb/test-operator-logs-container/0.log" Mar 19 21:02:51 crc kubenswrapper[4799]: I0319 21:02:51.005441 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz_4bb02004-3780-40e2-9e05-93d1791c0c16/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:02:51 crc kubenswrapper[4799]: E0319 21:02:51.598500 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-conmon-e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 21:02:58 crc kubenswrapper[4799]: I0319 21:02:58.623339 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d881c32e-3c0c-415a-aa56-6e70a316b015/memcached/0.log" Mar 19 21:02:59 crc kubenswrapper[4799]: I0319 21:02:59.116217 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:02:59 crc kubenswrapper[4799]: E0319 21:02:59.116812 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:03:01 crc kubenswrapper[4799]: E0319 21:03:01.815510 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-conmon-e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 21:03:12 crc kubenswrapper[4799]: E0319 21:03:12.059515 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-conmon-e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 21:03:13 crc 
kubenswrapper[4799]: I0319 21:03:13.123030 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:03:13 crc kubenswrapper[4799]: E0319 21:03:13.123548 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.227514 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/util/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.438137 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/util/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.443189 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/pull/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.447477 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/pull/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.649157 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/util/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 
21:03:17.664349 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/pull/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.684235 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/extract/0.log" Mar 19 21:03:17 crc kubenswrapper[4799]: I0319 21:03:17.880946 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-jphcf_f5a6c547-0da9-4313-817a-9562fa9cb775/manager/0.log" Mar 19 21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.117573 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-6j7gd_8f841081-d1d8-464a-ae77-af76f0a109ea/manager/0.log" Mar 19 21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.287690 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-d5sq6_3e5cf32e-9b90-4518-86bb-5237dbf97e55/manager/0.log" Mar 19 21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.341295 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-f7g26_1e9b69cc-5dc0-400e-9894-7ff0b173e6cb/manager/0.log" Mar 19 21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.552376 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-kswht_1984ed7f-dd4a-43b7-b724-a902bccf7448/manager/0.log" Mar 19 21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.704294 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-xrrtz_fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce/manager/0.log" Mar 19 
21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.802036 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-rrp47_23a48119-c751-435f-9ed5-5a4b0dcf7ae0/manager/0.log" Mar 19 21:03:18 crc kubenswrapper[4799]: I0319 21:03:18.999029 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-cnhvk_771a024f-6f6e-43f1-82cb-076a70663c36/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.125102 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-ct2xj_52430659-ee8f-4143-a0cc-554487c4ee41/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.210297 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-rhbks_310bb10d-00e4-4135-826e-43f7ca17bdf1/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.371178 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-j6bcm_a0891d87-bed5-4b7b-bab9-653866be0678/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.448766 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-jqm8n_5dffef78-a7d7-400c-8e1c-80fd01df4f07/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.654267 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5s2dz_b1950b27-4b38-4bd5-b858-fcf5aa82d7fd/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.666413 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6wvnr_72be7e05-f329-4420-9632-3f6827c4e0e9/manager/0.log" Mar 19 21:03:19 crc kubenswrapper[4799]: I0319 21:03:19.915852 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74c4796899hpqp2_8fe20f94-3898-4826-8f94-a97f5d7619d6/manager/0.log" Mar 19 21:03:20 crc kubenswrapper[4799]: I0319 21:03:20.120623 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b85c4d696-47f2t_4fdcc365-1a41-47ce-8988-24a55f0bb8ac/operator/0.log" Mar 19 21:03:20 crc kubenswrapper[4799]: I0319 21:03:20.375594 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zw5gb_d4a4d93a-af9b-49a6-8786-34f07a5a4ba4/registry-server/0.log" Mar 19 21:03:20 crc kubenswrapper[4799]: I0319 21:03:20.513364 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-gjj5c_879a3d02-050b-44dc-95ff-f4a010fe4739/manager/0.log" Mar 19 21:03:20 crc kubenswrapper[4799]: I0319 21:03:20.592731 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-4z6tx_9ac71989-25c8-4255-8612-a9f736ab50a1/manager/0.log" Mar 19 21:03:20 crc kubenswrapper[4799]: I0319 21:03:20.847796 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2scch_f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b/operator/0.log" Mar 19 21:03:20 crc kubenswrapper[4799]: I0319 21:03:20.932667 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-jrtt2_85dcf72e-1669-4316-afe2-2dbf9059cd35/manager/0.log" Mar 19 21:03:21 crc kubenswrapper[4799]: I0319 21:03:21.104004 4799 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-dhfqp_84e19238-e730-4bcf-9e09-1c6f3421a04d/manager/0.log" Mar 19 21:03:21 crc kubenswrapper[4799]: I0319 21:03:21.239847 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86bd8996f6-bbbxq_d0525132-b508-40b7-a9eb-4773cfde1c32/manager/0.log" Mar 19 21:03:21 crc kubenswrapper[4799]: I0319 21:03:21.249197 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-4v4kl_3cac2ce5-3a90-4588-88ba-11557915c62c/manager/0.log" Mar 19 21:03:21 crc kubenswrapper[4799]: I0319 21:03:21.353722 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-vp4rw_503586c4-3015-43d9-bb6e-56bef997c641/manager/0.log" Mar 19 21:03:22 crc kubenswrapper[4799]: E0319 21:03:22.269576 4799 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64efd86f_b043_4b0e_af5b_1f455bbaef15.slice/crio-conmon-e63772da3780b7bacfc7139c22a7a37c7939da7e21f1fba7b3d93f1c2b47102b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 21:03:26 crc kubenswrapper[4799]: I0319 21:03:26.115848 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:03:26 crc kubenswrapper[4799]: E0319 21:03:26.116523 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" 
Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.236110 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qz4lk"] Mar 19 21:03:38 crc kubenswrapper[4799]: E0319 21:03:38.237303 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64efd86f-b043-4b0e-af5b-1f455bbaef15" containerName="container-00" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.237324 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="64efd86f-b043-4b0e-af5b-1f455bbaef15" containerName="container-00" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.237700 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="64efd86f-b043-4b0e-af5b-1f455bbaef15" containerName="container-00" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.240004 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.244467 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gpcx\" (UniqueName: \"kubernetes.io/projected/f0e26ff5-2c74-45b7-99a4-761342edc10c-kube-api-access-9gpcx\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.244568 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e26ff5-2c74-45b7-99a4-761342edc10c-utilities\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.244618 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f0e26ff5-2c74-45b7-99a4-761342edc10c-catalog-content\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.259241 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz4lk"] Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.346959 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gpcx\" (UniqueName: \"kubernetes.io/projected/f0e26ff5-2c74-45b7-99a4-761342edc10c-kube-api-access-9gpcx\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.347024 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e26ff5-2c74-45b7-99a4-761342edc10c-utilities\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.347058 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e26ff5-2c74-45b7-99a4-761342edc10c-catalog-content\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.347560 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e26ff5-2c74-45b7-99a4-761342edc10c-utilities\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc 
kubenswrapper[4799]: I0319 21:03:38.347616 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e26ff5-2c74-45b7-99a4-761342edc10c-catalog-content\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.374096 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gpcx\" (UniqueName: \"kubernetes.io/projected/f0e26ff5-2c74-45b7-99a4-761342edc10c-kube-api-access-9gpcx\") pod \"certified-operators-qz4lk\" (UID: \"f0e26ff5-2c74-45b7-99a4-761342edc10c\") " pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:38 crc kubenswrapper[4799]: I0319 21:03:38.559779 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:39 crc kubenswrapper[4799]: I0319 21:03:39.116959 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:03:39 crc kubenswrapper[4799]: E0319 21:03:39.117409 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:03:39 crc kubenswrapper[4799]: I0319 21:03:39.131638 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz4lk"] Mar 19 21:03:39 crc kubenswrapper[4799]: I0319 21:03:39.925742 4799 generic.go:334] "Generic (PLEG): container finished" podID="f0e26ff5-2c74-45b7-99a4-761342edc10c" 
containerID="3c1441317c84f09f8e6f9c901dfa51dd76faa3ddbfb32b0eb6abd26fb5c4d266" exitCode=0 Mar 19 21:03:39 crc kubenswrapper[4799]: I0319 21:03:39.926021 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz4lk" event={"ID":"f0e26ff5-2c74-45b7-99a4-761342edc10c","Type":"ContainerDied","Data":"3c1441317c84f09f8e6f9c901dfa51dd76faa3ddbfb32b0eb6abd26fb5c4d266"} Mar 19 21:03:39 crc kubenswrapper[4799]: I0319 21:03:39.926551 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz4lk" event={"ID":"f0e26ff5-2c74-45b7-99a4-761342edc10c","Type":"ContainerStarted","Data":"5ceaf00a9917801a76fa8d2b983185fd91dee09ea16b86b6e5b6c97c658997a3"} Mar 19 21:03:41 crc kubenswrapper[4799]: I0319 21:03:41.149943 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-drzzt_8dd97bd4-11f1-48a2-ba74-2eba33de161b/control-plane-machine-set-operator/0.log" Mar 19 21:03:41 crc kubenswrapper[4799]: I0319 21:03:41.264128 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6ddkj_0b71b47b-d667-49c1-ae5b-3326bcde5508/kube-rbac-proxy/0.log" Mar 19 21:03:41 crc kubenswrapper[4799]: I0319 21:03:41.310701 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6ddkj_0b71b47b-d667-49c1-ae5b-3326bcde5508/machine-api-operator/0.log" Mar 19 21:03:45 crc kubenswrapper[4799]: I0319 21:03:45.993890 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz4lk" event={"ID":"f0e26ff5-2c74-45b7-99a4-761342edc10c","Type":"ContainerStarted","Data":"e828bdde265b16030659f5555cc85d5be800eebfc2e68747335c1c19db5c46c3"} Mar 19 21:03:47 crc kubenswrapper[4799]: I0319 21:03:47.009176 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="f0e26ff5-2c74-45b7-99a4-761342edc10c" containerID="e828bdde265b16030659f5555cc85d5be800eebfc2e68747335c1c19db5c46c3" exitCode=0 Mar 19 21:03:47 crc kubenswrapper[4799]: I0319 21:03:47.009257 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz4lk" event={"ID":"f0e26ff5-2c74-45b7-99a4-761342edc10c","Type":"ContainerDied","Data":"e828bdde265b16030659f5555cc85d5be800eebfc2e68747335c1c19db5c46c3"} Mar 19 21:03:53 crc kubenswrapper[4799]: I0319 21:03:53.069068 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz4lk" event={"ID":"f0e26ff5-2c74-45b7-99a4-761342edc10c","Type":"ContainerStarted","Data":"0573d6b1adef67b4589a7655fc5de5c1fa730b777ed74ead64167c99f8bcde5d"} Mar 19 21:03:53 crc kubenswrapper[4799]: I0319 21:03:53.104515 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qz4lk" podStartSLOduration=2.589145834 podStartE2EDuration="15.104492564s" podCreationTimestamp="2026-03-19 21:03:38 +0000 UTC" firstStartedPulling="2026-03-19 21:03:39.928132816 +0000 UTC m=+3497.534085888" lastFinishedPulling="2026-03-19 21:03:52.443479526 +0000 UTC m=+3510.049432618" observedRunningTime="2026-03-19 21:03:53.094637537 +0000 UTC m=+3510.700590629" watchObservedRunningTime="2026-03-19 21:03:53.104492564 +0000 UTC m=+3510.710445656" Mar 19 21:03:54 crc kubenswrapper[4799]: I0319 21:03:54.116608 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:03:54 crc kubenswrapper[4799]: E0319 21:03:54.117205 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:03:54 crc kubenswrapper[4799]: I0319 21:03:54.676526 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fv54b_5f8bca57-6368-4bd6-9d79-d0e640dd074f/cert-manager-controller/0.log" Mar 19 21:03:54 crc kubenswrapper[4799]: I0319 21:03:54.848957 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-9ppqb_a471aa18-d5fa-455b-b8a6-395717db50b9/cert-manager-cainjector/0.log" Mar 19 21:03:54 crc kubenswrapper[4799]: I0319 21:03:54.861560 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-jtk48_570da1bb-c2ff-40e5-a2b6-352d09168d6d/cert-manager-webhook/0.log" Mar 19 21:03:58 crc kubenswrapper[4799]: I0319 21:03:58.560963 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:58 crc kubenswrapper[4799]: I0319 21:03:58.561639 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:58 crc kubenswrapper[4799]: I0319 21:03:58.631128 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.175897 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qz4lk" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.260993 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz4lk"] Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.322010 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5gh9p"] Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 
21:03:59.322306 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5gh9p" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="registry-server" containerID="cri-o://327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d" gracePeriod=2 Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.364840 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-5gh9p" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="registry-server" probeResult="failure" output="" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.782232 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.860151 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-catalog-content\") pod \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.860329 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-utilities\") pod \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.860436 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgc7k\" (UniqueName: \"kubernetes.io/projected/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-kube-api-access-qgc7k\") pod \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\" (UID: \"6a0a44ba-035d-4c3e-b653-5fc884ab78dc\") " Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.862011 4799 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-utilities" (OuterVolumeSpecName: "utilities") pod "6a0a44ba-035d-4c3e-b653-5fc884ab78dc" (UID: "6a0a44ba-035d-4c3e-b653-5fc884ab78dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.881577 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-kube-api-access-qgc7k" (OuterVolumeSpecName: "kube-api-access-qgc7k") pod "6a0a44ba-035d-4c3e-b653-5fc884ab78dc" (UID: "6a0a44ba-035d-4c3e-b653-5fc884ab78dc"). InnerVolumeSpecName "kube-api-access-qgc7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.920930 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a0a44ba-035d-4c3e-b653-5fc884ab78dc" (UID: "6a0a44ba-035d-4c3e-b653-5fc884ab78dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.963007 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.963048 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgc7k\" (UniqueName: \"kubernetes.io/projected/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-kube-api-access-qgc7k\") on node \"crc\" DevicePath \"\"" Mar 19 21:03:59 crc kubenswrapper[4799]: I0319 21:03:59.963061 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0a44ba-035d-4c3e-b653-5fc884ab78dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.141166 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565904-gzkx5"] Mar 19 21:04:00 crc kubenswrapper[4799]: E0319 21:04:00.141621 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="registry-server" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.141640 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="registry-server" Mar 19 21:04:00 crc kubenswrapper[4799]: E0319 21:04:00.141660 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="extract-content" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.141668 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="extract-content" Mar 19 21:04:00 crc kubenswrapper[4799]: E0319 21:04:00.141686 4799 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="extract-utilities" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.141694 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="extract-utilities" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.141975 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerName="registry-server" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.142712 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.145843 4799 generic.go:334] "Generic (PLEG): container finished" podID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" containerID="327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d" exitCode=0 Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.145878 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gh9p" event={"ID":"6a0a44ba-035d-4c3e-b653-5fc884ab78dc","Type":"ContainerDied","Data":"327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d"} Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.145908 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5gh9p" event={"ID":"6a0a44ba-035d-4c3e-b653-5fc884ab78dc","Type":"ContainerDied","Data":"d5360605e39a400471bb910988ef617129b7c1de2b889de25283514b4336178d"} Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.145917 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5gh9p" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.145925 4799 scope.go:117] "RemoveContainer" containerID="327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.148843 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.148855 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.149202 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565904-gzkx5"] Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.151615 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.179291 4799 scope.go:117] "RemoveContainer" containerID="57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.202591 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5gh9p"] Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.211502 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5gh9p"] Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.212541 4799 scope.go:117] "RemoveContainer" containerID="347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.256774 4799 scope.go:117] "RemoveContainer" containerID="327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d" Mar 19 21:04:00 crc kubenswrapper[4799]: E0319 21:04:00.257268 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d\": container with ID starting with 327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d not found: ID does not exist" containerID="327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.257295 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d"} err="failed to get container status \"327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d\": rpc error: code = NotFound desc = could not find container \"327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d\": container with ID starting with 327cbb3cdd25af3de5be325c1062275cd747539ca0e32bd5bde499be0df5ae6d not found: ID does not exist" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.257317 4799 scope.go:117] "RemoveContainer" containerID="57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3" Mar 19 21:04:00 crc kubenswrapper[4799]: E0319 21:04:00.257691 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3\": container with ID starting with 57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3 not found: ID does not exist" containerID="57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.257744 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3"} err="failed to get container status \"57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3\": rpc error: code = NotFound desc = could not find container 
\"57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3\": container with ID starting with 57a69099f4578cc4cf52daced19f3e1b21286274f3162d3b4dd8aa1d3f1e03f3 not found: ID does not exist" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.257779 4799 scope.go:117] "RemoveContainer" containerID="347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7" Mar 19 21:04:00 crc kubenswrapper[4799]: E0319 21:04:00.258071 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7\": container with ID starting with 347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7 not found: ID does not exist" containerID="347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.258104 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7"} err="failed to get container status \"347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7\": rpc error: code = NotFound desc = could not find container \"347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7\": container with ID starting with 347a203a2e0dbd248f1721ee1b8b64986fba406673ab191ac32fd4c83b010de7 not found: ID does not exist" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.268453 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqtb\" (UniqueName: \"kubernetes.io/projected/83822866-9772-43df-8f03-7093c5f1dba3-kube-api-access-wjqtb\") pod \"auto-csr-approver-29565904-gzkx5\" (UID: \"83822866-9772-43df-8f03-7093c5f1dba3\") " pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.370847 4799 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wjqtb\" (UniqueName: \"kubernetes.io/projected/83822866-9772-43df-8f03-7093c5f1dba3-kube-api-access-wjqtb\") pod \"auto-csr-approver-29565904-gzkx5\" (UID: \"83822866-9772-43df-8f03-7093c5f1dba3\") " pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.387062 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqtb\" (UniqueName: \"kubernetes.io/projected/83822866-9772-43df-8f03-7093c5f1dba3-kube-api-access-wjqtb\") pod \"auto-csr-approver-29565904-gzkx5\" (UID: \"83822866-9772-43df-8f03-7093c5f1dba3\") " pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.470329 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:00 crc kubenswrapper[4799]: I0319 21:04:00.971954 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565904-gzkx5"] Mar 19 21:04:01 crc kubenswrapper[4799]: I0319 21:04:01.128171 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0a44ba-035d-4c3e-b653-5fc884ab78dc" path="/var/lib/kubelet/pods/6a0a44ba-035d-4c3e-b653-5fc884ab78dc/volumes" Mar 19 21:04:01 crc kubenswrapper[4799]: I0319 21:04:01.155333 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" event={"ID":"83822866-9772-43df-8f03-7093c5f1dba3","Type":"ContainerStarted","Data":"b92cea830720461d6f1f049cb8a12029fb7228eb17244b4eeb52f088ec3716f5"} Mar 19 21:04:03 crc kubenswrapper[4799]: I0319 21:04:03.183544 4799 generic.go:334] "Generic (PLEG): container finished" podID="83822866-9772-43df-8f03-7093c5f1dba3" containerID="839f5503683b265796d728638aa312c27694ff772c60ad0d4b4f432846d58d0c" exitCode=0 Mar 19 21:04:03 crc kubenswrapper[4799]: I0319 21:04:03.184092 4799 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" event={"ID":"83822866-9772-43df-8f03-7093c5f1dba3","Type":"ContainerDied","Data":"839f5503683b265796d728638aa312c27694ff772c60ad0d4b4f432846d58d0c"} Mar 19 21:04:04 crc kubenswrapper[4799]: I0319 21:04:04.608194 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:04 crc kubenswrapper[4799]: I0319 21:04:04.768987 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjqtb\" (UniqueName: \"kubernetes.io/projected/83822866-9772-43df-8f03-7093c5f1dba3-kube-api-access-wjqtb\") pod \"83822866-9772-43df-8f03-7093c5f1dba3\" (UID: \"83822866-9772-43df-8f03-7093c5f1dba3\") " Mar 19 21:04:04 crc kubenswrapper[4799]: I0319 21:04:04.785234 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83822866-9772-43df-8f03-7093c5f1dba3-kube-api-access-wjqtb" (OuterVolumeSpecName: "kube-api-access-wjqtb") pod "83822866-9772-43df-8f03-7093c5f1dba3" (UID: "83822866-9772-43df-8f03-7093c5f1dba3"). InnerVolumeSpecName "kube-api-access-wjqtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:04:04 crc kubenswrapper[4799]: I0319 21:04:04.871814 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjqtb\" (UniqueName: \"kubernetes.io/projected/83822866-9772-43df-8f03-7093c5f1dba3-kube-api-access-wjqtb\") on node \"crc\" DevicePath \"\"" Mar 19 21:04:05 crc kubenswrapper[4799]: I0319 21:04:05.200820 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" event={"ID":"83822866-9772-43df-8f03-7093c5f1dba3","Type":"ContainerDied","Data":"b92cea830720461d6f1f049cb8a12029fb7228eb17244b4eeb52f088ec3716f5"} Mar 19 21:04:05 crc kubenswrapper[4799]: I0319 21:04:05.200860 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b92cea830720461d6f1f049cb8a12029fb7228eb17244b4eeb52f088ec3716f5" Mar 19 21:04:05 crc kubenswrapper[4799]: I0319 21:04:05.200874 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565904-gzkx5" Mar 19 21:04:05 crc kubenswrapper[4799]: I0319 21:04:05.700061 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565898-vcv8t"] Mar 19 21:04:05 crc kubenswrapper[4799]: I0319 21:04:05.720801 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565898-vcv8t"] Mar 19 21:04:07 crc kubenswrapper[4799]: I0319 21:04:07.134310 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4948af39-6193-464c-8a26-94fe04ea30d6" path="/var/lib/kubelet/pods/4948af39-6193-464c-8a26-94fe04ea30d6/volumes" Mar 19 21:04:08 crc kubenswrapper[4799]: I0319 21:04:08.116127 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:04:09 crc kubenswrapper[4799]: I0319 21:04:09.249721 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"2b31612692af021bb5f91c7f48d51540f477275a528f51f53a0921d879975895"} Mar 19 21:04:09 crc kubenswrapper[4799]: I0319 21:04:09.553036 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-9s6cx_540372f2-ca9d-47b0-aaa6-86831627cd8e/nmstate-console-plugin/0.log" Mar 19 21:04:09 crc kubenswrapper[4799]: I0319 21:04:09.898004 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qqjcl_7c8d90dd-d173-4fd7-a3ae-ed312bc20861/nmstate-handler/0.log" Mar 19 21:04:10 crc kubenswrapper[4799]: I0319 21:04:10.074845 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qzf64_dab69c67-b7fc-4f89-93c6-6ee825d89b7d/kube-rbac-proxy/0.log" Mar 19 21:04:10 crc kubenswrapper[4799]: I0319 21:04:10.212901 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qzf64_dab69c67-b7fc-4f89-93c6-6ee825d89b7d/nmstate-metrics/0.log" Mar 19 21:04:10 crc kubenswrapper[4799]: I0319 21:04:10.334852 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-29txn_44b60556-07ce-4245-a994-dded304e075b/nmstate-operator/0.log" Mar 19 21:04:10 crc kubenswrapper[4799]: I0319 21:04:10.429315 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-psznr_dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6/nmstate-webhook/0.log" Mar 19 21:04:22 crc kubenswrapper[4799]: I0319 21:04:22.004410 4799 scope.go:117] "RemoveContainer" containerID="bfe3e4a0377ff1424dcfe0532b81f80d51f7606cace805d2cd3ad95d7aba3491" Mar 19 21:04:39 crc kubenswrapper[4799]: I0319 21:04:39.704533 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-9nd8c_8a6fd137-8e20-4043-b746-7d4b884ffc5a/kube-rbac-proxy/0.log" Mar 19 21:04:39 crc kubenswrapper[4799]: I0319 21:04:39.785636 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-9nd8c_8a6fd137-8e20-4043-b746-7d4b884ffc5a/controller/0.log" Mar 19 21:04:39 crc kubenswrapper[4799]: I0319 21:04:39.909288 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.087856 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.089481 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.127207 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.130198 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.298033 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.301146 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.375685 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.396287 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.473085 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.497567 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.561604 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.606206 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/controller/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.679567 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/frr-metrics/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.750486 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/kube-rbac-proxy/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.826041 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/kube-rbac-proxy-frr/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.864104 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/reloader/0.log" Mar 19 21:04:40 crc kubenswrapper[4799]: I0319 21:04:40.992742 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-2gv7v_b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a/frr-k8s-webhook-server/0.log" Mar 19 21:04:41 crc kubenswrapper[4799]: I0319 21:04:41.161081 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5578d7df77-xlzz9_993c9a96-b852-40c4-87e6-02e706b89b25/manager/0.log" Mar 19 21:04:41 crc kubenswrapper[4799]: I0319 21:04:41.274965 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59fdf54f4b-tp45h_0ec1faac-95e3-4189-bbfd-acc0f4662787/webhook-server/0.log" Mar 19 21:04:41 crc kubenswrapper[4799]: I0319 21:04:41.457636 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m865t_f71d2873-ac06-4cca-b70c-162e283e23b8/kube-rbac-proxy/0.log" Mar 19 21:04:41 crc kubenswrapper[4799]: I0319 21:04:41.906445 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m865t_f71d2873-ac06-4cca-b70c-162e283e23b8/speaker/0.log" Mar 19 21:04:42 crc kubenswrapper[4799]: I0319 21:04:42.112407 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/frr/0.log" Mar 19 21:04:55 crc kubenswrapper[4799]: I0319 21:04:55.863588 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/util/0.log" Mar 19 21:04:55 crc kubenswrapper[4799]: I0319 21:04:55.988218 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/util/0.log" 
Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.073021 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/pull/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.094119 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/pull/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.259194 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/pull/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.270136 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/extract/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.294263 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/util/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.418069 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/util/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.583209 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/util/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.609449 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/pull/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.621918 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/pull/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.757244 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/util/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.781235 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/pull/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.810786 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/extract/0.log" Mar 19 21:04:56 crc kubenswrapper[4799]: I0319 21:04:56.918563 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-utilities/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.094601 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-content/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.094914 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-utilities/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.150804 4799 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-content/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.284205 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-utilities/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.303435 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-content/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.474304 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/registry-server/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.529474 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-utilities/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.692090 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-utilities/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.697239 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-content/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.697993 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-content/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.849140 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-utilities/0.log" Mar 19 21:04:57 crc kubenswrapper[4799]: I0319 21:04:57.884190 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-content/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.088480 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s6hbz_40e9b27a-0c5f-45a3-b424-d1b289b65167/marketplace-operator/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.164900 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-utilities/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.385311 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-content/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.396234 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-utilities/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.408965 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-content/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.448273 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/registry-server/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.573583 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-content/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.597878 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-utilities/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.709151 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/registry-server/0.log" Mar 19 21:04:58 crc kubenswrapper[4799]: I0319 21:04:58.852426 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-utilities/0.log" Mar 19 21:04:59 crc kubenswrapper[4799]: I0319 21:04:59.018830 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-content/0.log" Mar 19 21:04:59 crc kubenswrapper[4799]: I0319 21:04:59.025465 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-utilities/0.log" Mar 19 21:04:59 crc kubenswrapper[4799]: I0319 21:04:59.050901 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-content/0.log" Mar 19 21:04:59 crc kubenswrapper[4799]: I0319 21:04:59.228112 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-utilities/0.log" Mar 19 21:04:59 crc kubenswrapper[4799]: I0319 21:04:59.231942 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-content/0.log" Mar 
19 21:04:59 crc kubenswrapper[4799]: I0319 21:04:59.792715 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/registry-server/0.log" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.014470 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-swqdv"] Mar 19 21:05:37 crc kubenswrapper[4799]: E0319 21:05:37.016186 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83822866-9772-43df-8f03-7093c5f1dba3" containerName="oc" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.016225 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="83822866-9772-43df-8f03-7093c5f1dba3" containerName="oc" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.016729 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="83822866-9772-43df-8f03-7093c5f1dba3" containerName="oc" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.020053 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.034482 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-swqdv"] Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.117289 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj5mj\" (UniqueName: \"kubernetes.io/projected/a38874d7-7d3a-4de9-ae23-522156ba72e2-kube-api-access-fj5mj\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.117447 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-utilities\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.117505 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-catalog-content\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.219580 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj5mj\" (UniqueName: \"kubernetes.io/projected/a38874d7-7d3a-4de9-ae23-522156ba72e2-kube-api-access-fj5mj\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.222354 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-utilities\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.222885 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-utilities\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.223945 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-catalog-content\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.224571 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-catalog-content\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.241505 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj5mj\" (UniqueName: \"kubernetes.io/projected/a38874d7-7d3a-4de9-ae23-522156ba72e2-kube-api-access-fj5mj\") pod \"redhat-marketplace-swqdv\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.375069 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:37 crc kubenswrapper[4799]: I0319 21:05:37.920322 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-swqdv"] Mar 19 21:05:38 crc kubenswrapper[4799]: I0319 21:05:38.083913 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swqdv" event={"ID":"a38874d7-7d3a-4de9-ae23-522156ba72e2","Type":"ContainerStarted","Data":"143d0d03ff9742e46cba5aa1fdc9c60ad7ebf5587fe1f396f85bded873717ec1"} Mar 19 21:05:39 crc kubenswrapper[4799]: I0319 21:05:39.096757 4799 generic.go:334] "Generic (PLEG): container finished" podID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerID="5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c" exitCode=0 Mar 19 21:05:39 crc kubenswrapper[4799]: I0319 21:05:39.096842 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swqdv" event={"ID":"a38874d7-7d3a-4de9-ae23-522156ba72e2","Type":"ContainerDied","Data":"5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c"} Mar 19 21:05:39 crc kubenswrapper[4799]: I0319 21:05:39.099926 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 21:05:41 crc kubenswrapper[4799]: I0319 21:05:41.135552 4799 generic.go:334] "Generic (PLEG): container finished" podID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerID="4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656" exitCode=0 Mar 19 21:05:41 crc kubenswrapper[4799]: I0319 21:05:41.136150 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swqdv" event={"ID":"a38874d7-7d3a-4de9-ae23-522156ba72e2","Type":"ContainerDied","Data":"4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656"} Mar 19 21:05:42 crc kubenswrapper[4799]: I0319 21:05:42.149712 4799 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-swqdv" event={"ID":"a38874d7-7d3a-4de9-ae23-522156ba72e2","Type":"ContainerStarted","Data":"0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77"} Mar 19 21:05:42 crc kubenswrapper[4799]: I0319 21:05:42.200278 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-swqdv" podStartSLOduration=3.765712707 podStartE2EDuration="6.200248821s" podCreationTimestamp="2026-03-19 21:05:36 +0000 UTC" firstStartedPulling="2026-03-19 21:05:39.099564534 +0000 UTC m=+3616.705517646" lastFinishedPulling="2026-03-19 21:05:41.534100698 +0000 UTC m=+3619.140053760" observedRunningTime="2026-03-19 21:05:42.186951816 +0000 UTC m=+3619.792904908" watchObservedRunningTime="2026-03-19 21:05:42.200248821 +0000 UTC m=+3619.806201943" Mar 19 21:05:47 crc kubenswrapper[4799]: I0319 21:05:47.375789 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:47 crc kubenswrapper[4799]: I0319 21:05:47.376540 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:47 crc kubenswrapper[4799]: I0319 21:05:47.461844 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:48 crc kubenswrapper[4799]: I0319 21:05:48.289554 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:48 crc kubenswrapper[4799]: I0319 21:05:48.357180 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-swqdv"] Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.246050 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-swqdv" 
podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="registry-server" containerID="cri-o://0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77" gracePeriod=2 Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.852137 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.918227 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-utilities\") pod \"a38874d7-7d3a-4de9-ae23-522156ba72e2\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.918494 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-catalog-content\") pod \"a38874d7-7d3a-4de9-ae23-522156ba72e2\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.918598 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj5mj\" (UniqueName: \"kubernetes.io/projected/a38874d7-7d3a-4de9-ae23-522156ba72e2-kube-api-access-fj5mj\") pod \"a38874d7-7d3a-4de9-ae23-522156ba72e2\" (UID: \"a38874d7-7d3a-4de9-ae23-522156ba72e2\") " Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.919459 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-utilities" (OuterVolumeSpecName: "utilities") pod "a38874d7-7d3a-4de9-ae23-522156ba72e2" (UID: "a38874d7-7d3a-4de9-ae23-522156ba72e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.926257 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a38874d7-7d3a-4de9-ae23-522156ba72e2-kube-api-access-fj5mj" (OuterVolumeSpecName: "kube-api-access-fj5mj") pod "a38874d7-7d3a-4de9-ae23-522156ba72e2" (UID: "a38874d7-7d3a-4de9-ae23-522156ba72e2"). InnerVolumeSpecName "kube-api-access-fj5mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:05:50 crc kubenswrapper[4799]: I0319 21:05:50.954416 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a38874d7-7d3a-4de9-ae23-522156ba72e2" (UID: "a38874d7-7d3a-4de9-ae23-522156ba72e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.021008 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj5mj\" (UniqueName: \"kubernetes.io/projected/a38874d7-7d3a-4de9-ae23-522156ba72e2-kube-api-access-fj5mj\") on node \"crc\" DevicePath \"\"" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.021042 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.021053 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a38874d7-7d3a-4de9-ae23-522156ba72e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.284571 4799 generic.go:334] "Generic (PLEG): container finished" podID="a38874d7-7d3a-4de9-ae23-522156ba72e2" 
containerID="0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77" exitCode=0 Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.284837 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swqdv" event={"ID":"a38874d7-7d3a-4de9-ae23-522156ba72e2","Type":"ContainerDied","Data":"0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77"} Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.284863 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swqdv" event={"ID":"a38874d7-7d3a-4de9-ae23-522156ba72e2","Type":"ContainerDied","Data":"143d0d03ff9742e46cba5aa1fdc9c60ad7ebf5587fe1f396f85bded873717ec1"} Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.284881 4799 scope.go:117] "RemoveContainer" containerID="0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.285015 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swqdv" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.318680 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-swqdv"] Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.318732 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-swqdv"] Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.327992 4799 scope.go:117] "RemoveContainer" containerID="4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.351755 4799 scope.go:117] "RemoveContainer" containerID="5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.397715 4799 scope.go:117] "RemoveContainer" containerID="0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77" Mar 19 21:05:51 crc kubenswrapper[4799]: E0319 21:05:51.398989 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77\": container with ID starting with 0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77 not found: ID does not exist" containerID="0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.399033 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77"} err="failed to get container status \"0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77\": rpc error: code = NotFound desc = could not find container \"0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77\": container with ID starting with 0a8860134d6fd270411d30dc29acdfa0550bae064ac418db21af3483bb780d77 not found: 
ID does not exist" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.399054 4799 scope.go:117] "RemoveContainer" containerID="4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656" Mar 19 21:05:51 crc kubenswrapper[4799]: E0319 21:05:51.401836 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656\": container with ID starting with 4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656 not found: ID does not exist" containerID="4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.401997 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656"} err="failed to get container status \"4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656\": rpc error: code = NotFound desc = could not find container \"4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656\": container with ID starting with 4c63dc444c7e246375885413bb2a84b8368e2163a11ebf3491ed59c2f3630656 not found: ID does not exist" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.402121 4799 scope.go:117] "RemoveContainer" containerID="5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c" Mar 19 21:05:51 crc kubenswrapper[4799]: E0319 21:05:51.403729 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c\": container with ID starting with 5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c not found: ID does not exist" containerID="5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c" Mar 19 21:05:51 crc kubenswrapper[4799]: I0319 21:05:51.403806 4799 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c"} err="failed to get container status \"5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c\": rpc error: code = NotFound desc = could not find container \"5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c\": container with ID starting with 5ae0a8e0210343740417407d026acbf4748c09d88eaff7f78dad8d1bbd31071c not found: ID does not exist" Mar 19 21:05:53 crc kubenswrapper[4799]: I0319 21:05:53.129859 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" path="/var/lib/kubelet/pods/a38874d7-7d3a-4de9-ae23-522156ba72e2/volumes" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.165802 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565906-nmp4d"] Mar 19 21:06:00 crc kubenswrapper[4799]: E0319 21:06:00.166895 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="registry-server" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.166918 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="registry-server" Mar 19 21:06:00 crc kubenswrapper[4799]: E0319 21:06:00.166944 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="extract-utilities" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.166958 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="extract-utilities" Mar 19 21:06:00 crc kubenswrapper[4799]: E0319 21:06:00.166985 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="extract-content" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.166997 4799 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="extract-content" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.167336 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="a38874d7-7d3a-4de9-ae23-522156ba72e2" containerName="registry-server" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.168243 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.170653 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.171942 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.173376 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.182323 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565906-nmp4d"] Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.323134 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2w6\" (UniqueName: \"kubernetes.io/projected/5babc98b-61ae-4dc4-a213-b01a732a7a1a-kube-api-access-lq2w6\") pod \"auto-csr-approver-29565906-nmp4d\" (UID: \"5babc98b-61ae-4dc4-a213-b01a732a7a1a\") " pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.424966 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2w6\" (UniqueName: \"kubernetes.io/projected/5babc98b-61ae-4dc4-a213-b01a732a7a1a-kube-api-access-lq2w6\") pod \"auto-csr-approver-29565906-nmp4d\" (UID: 
\"5babc98b-61ae-4dc4-a213-b01a732a7a1a\") " pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.450359 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2w6\" (UniqueName: \"kubernetes.io/projected/5babc98b-61ae-4dc4-a213-b01a732a7a1a-kube-api-access-lq2w6\") pod \"auto-csr-approver-29565906-nmp4d\" (UID: \"5babc98b-61ae-4dc4-a213-b01a732a7a1a\") " pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:00 crc kubenswrapper[4799]: I0319 21:06:00.499480 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:01 crc kubenswrapper[4799]: I0319 21:06:01.006250 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565906-nmp4d"] Mar 19 21:06:01 crc kubenswrapper[4799]: W0319 21:06:01.017546 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5babc98b_61ae_4dc4_a213_b01a732a7a1a.slice/crio-744f81e3c568f7baa61f61facb5346f2a1c49ddaf358abbd3cd49e7f49117408 WatchSource:0}: Error finding container 744f81e3c568f7baa61f61facb5346f2a1c49ddaf358abbd3cd49e7f49117408: Status 404 returned error can't find the container with id 744f81e3c568f7baa61f61facb5346f2a1c49ddaf358abbd3cd49e7f49117408 Mar 19 21:06:01 crc kubenswrapper[4799]: I0319 21:06:01.429799 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" event={"ID":"5babc98b-61ae-4dc4-a213-b01a732a7a1a","Type":"ContainerStarted","Data":"744f81e3c568f7baa61f61facb5346f2a1c49ddaf358abbd3cd49e7f49117408"} Mar 19 21:06:03 crc kubenswrapper[4799]: I0319 21:06:03.452590 4799 generic.go:334] "Generic (PLEG): container finished" podID="5babc98b-61ae-4dc4-a213-b01a732a7a1a" containerID="c697b47b06a97066b2cdb12e98f73f6b5beced493308146ac5cbaa7024c6bc91" exitCode=0 
Mar 19 21:06:03 crc kubenswrapper[4799]: I0319 21:06:03.452651 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" event={"ID":"5babc98b-61ae-4dc4-a213-b01a732a7a1a","Type":"ContainerDied","Data":"c697b47b06a97066b2cdb12e98f73f6b5beced493308146ac5cbaa7024c6bc91"} Mar 19 21:06:04 crc kubenswrapper[4799]: I0319 21:06:04.892145 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:04 crc kubenswrapper[4799]: I0319 21:06:04.931359 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq2w6\" (UniqueName: \"kubernetes.io/projected/5babc98b-61ae-4dc4-a213-b01a732a7a1a-kube-api-access-lq2w6\") pod \"5babc98b-61ae-4dc4-a213-b01a732a7a1a\" (UID: \"5babc98b-61ae-4dc4-a213-b01a732a7a1a\") " Mar 19 21:06:04 crc kubenswrapper[4799]: I0319 21:06:04.965269 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5babc98b-61ae-4dc4-a213-b01a732a7a1a-kube-api-access-lq2w6" (OuterVolumeSpecName: "kube-api-access-lq2w6") pod "5babc98b-61ae-4dc4-a213-b01a732a7a1a" (UID: "5babc98b-61ae-4dc4-a213-b01a732a7a1a"). InnerVolumeSpecName "kube-api-access-lq2w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:06:05 crc kubenswrapper[4799]: I0319 21:06:05.034191 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq2w6\" (UniqueName: \"kubernetes.io/projected/5babc98b-61ae-4dc4-a213-b01a732a7a1a-kube-api-access-lq2w6\") on node \"crc\" DevicePath \"\"" Mar 19 21:06:05 crc kubenswrapper[4799]: I0319 21:06:05.487219 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" Mar 19 21:06:05 crc kubenswrapper[4799]: I0319 21:06:05.487240 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565906-nmp4d" event={"ID":"5babc98b-61ae-4dc4-a213-b01a732a7a1a","Type":"ContainerDied","Data":"744f81e3c568f7baa61f61facb5346f2a1c49ddaf358abbd3cd49e7f49117408"} Mar 19 21:06:05 crc kubenswrapper[4799]: I0319 21:06:05.487866 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="744f81e3c568f7baa61f61facb5346f2a1c49ddaf358abbd3cd49e7f49117408" Mar 19 21:06:05 crc kubenswrapper[4799]: I0319 21:06:05.999040 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565900-dw4qs"] Mar 19 21:06:06 crc kubenswrapper[4799]: I0319 21:06:06.010215 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565900-dw4qs"] Mar 19 21:06:07 crc kubenswrapper[4799]: I0319 21:06:07.136310 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb8e45b9-668d-422d-ae1c-2ca61c39401c" path="/var/lib/kubelet/pods/cb8e45b9-668d-422d-ae1c-2ca61c39401c/volumes" Mar 19 21:06:22 crc kubenswrapper[4799]: I0319 21:06:22.121800 4799 scope.go:117] "RemoveContainer" containerID="03979ec3cad5ccac0e7d0c965d1a524b71a193e9f69bc619e906a545d25a1cd0" Mar 19 21:06:28 crc kubenswrapper[4799]: I0319 21:06:28.756035 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 21:06:28 crc kubenswrapper[4799]: I0319 21:06:28.756672 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 21:06:42 crc kubenswrapper[4799]: I0319 21:06:42.985737 4799 generic.go:334] "Generic (PLEG): container finished" podID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerID="02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5" exitCode=0 Mar 19 21:06:42 crc kubenswrapper[4799]: I0319 21:06:42.985861 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9466g/must-gather-l749f" event={"ID":"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd","Type":"ContainerDied","Data":"02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5"} Mar 19 21:06:42 crc kubenswrapper[4799]: I0319 21:06:42.987238 4799 scope.go:117] "RemoveContainer" containerID="02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5" Mar 19 21:06:43 crc kubenswrapper[4799]: I0319 21:06:43.733535 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9466g_must-gather-l749f_7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd/gather/0.log" Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.294168 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9466g/must-gather-l749f"] Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.295128 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9466g/must-gather-l749f" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="copy" containerID="cri-o://552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24" gracePeriod=2 Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.304323 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9466g/must-gather-l749f"] Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.739564 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-9466g_must-gather-l749f_7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd/copy/0.log" Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.740227 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.785707 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-must-gather-output\") pod \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.785791 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc2hp\" (UniqueName: \"kubernetes.io/projected/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-kube-api-access-dc2hp\") pod \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\" (UID: \"7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd\") " Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.793135 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-kube-api-access-dc2hp" (OuterVolumeSpecName: "kube-api-access-dc2hp") pod "7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" (UID: "7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd"). InnerVolumeSpecName "kube-api-access-dc2hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.887119 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc2hp\" (UniqueName: \"kubernetes.io/projected/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-kube-api-access-dc2hp\") on node \"crc\" DevicePath \"\"" Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.929250 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" (UID: "7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:06:52 crc kubenswrapper[4799]: I0319 21:06:52.988871 4799 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.108264 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9466g_must-gather-l749f_7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd/copy/0.log" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.109402 4799 generic.go:334] "Generic (PLEG): container finished" podID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerID="552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24" exitCode=143 Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.109451 4799 scope.go:117] "RemoveContainer" containerID="552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.109551 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9466g/must-gather-l749f" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.132500 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" path="/var/lib/kubelet/pods/7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd/volumes" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.139815 4799 scope.go:117] "RemoveContainer" containerID="02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.216544 4799 scope.go:117] "RemoveContainer" containerID="552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24" Mar 19 21:06:53 crc kubenswrapper[4799]: E0319 21:06:53.217812 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24\": container with ID starting with 552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24 not found: ID does not exist" containerID="552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.217851 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24"} err="failed to get container status \"552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24\": rpc error: code = NotFound desc = could not find container \"552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24\": container with ID starting with 552872ab474246658808f06d78c632ce04e61c0b98359472f317457bfd557b24 not found: ID does not exist" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.217877 4799 scope.go:117] "RemoveContainer" containerID="02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5" Mar 19 21:06:53 crc kubenswrapper[4799]: E0319 21:06:53.218741 4799 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5\": container with ID starting with 02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5 not found: ID does not exist" containerID="02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5" Mar 19 21:06:53 crc kubenswrapper[4799]: I0319 21:06:53.218807 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5"} err="failed to get container status \"02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5\": rpc error: code = NotFound desc = could not find container \"02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5\": container with ID starting with 02015e42ebef6c87c859ee4b1a1167912ba0401ee21c06f0d6a1006201217ea5 not found: ID does not exist" Mar 19 21:06:58 crc kubenswrapper[4799]: I0319 21:06:58.756516 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 21:06:58 crc kubenswrapper[4799]: I0319 21:06:58.757315 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 21:07:28 crc kubenswrapper[4799]: I0319 21:07:28.755701 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 21:07:28 crc kubenswrapper[4799]: I0319 21:07:28.756433 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 21:07:28 crc kubenswrapper[4799]: I0319 21:07:28.756519 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 21:07:28 crc kubenswrapper[4799]: I0319 21:07:28.757661 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b31612692af021bb5f91c7f48d51540f477275a528f51f53a0921d879975895"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 21:07:28 crc kubenswrapper[4799]: I0319 21:07:28.757790 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://2b31612692af021bb5f91c7f48d51540f477275a528f51f53a0921d879975895" gracePeriod=600 Mar 19 21:07:29 crc kubenswrapper[4799]: I0319 21:07:29.523626 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="2b31612692af021bb5f91c7f48d51540f477275a528f51f53a0921d879975895" exitCode=0 Mar 19 21:07:29 crc kubenswrapper[4799]: I0319 21:07:29.523710 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"2b31612692af021bb5f91c7f48d51540f477275a528f51f53a0921d879975895"} Mar 19 21:07:29 crc kubenswrapper[4799]: I0319 21:07:29.524306 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8"} Mar 19 21:07:29 crc kubenswrapper[4799]: I0319 21:07:29.524328 4799 scope.go:117] "RemoveContainer" containerID="2e38920320be253d3afd66aa8260f2b38cf55292f10be6a616913a561adc5365" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.170083 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565908-mg7ws"] Mar 19 21:08:00 crc kubenswrapper[4799]: E0319 21:08:00.171493 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="copy" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.171520 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="copy" Mar 19 21:08:00 crc kubenswrapper[4799]: E0319 21:08:00.171551 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="gather" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.171563 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="gather" Mar 19 21:08:00 crc kubenswrapper[4799]: E0319 21:08:00.171605 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5babc98b-61ae-4dc4-a213-b01a732a7a1a" containerName="oc" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.171617 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5babc98b-61ae-4dc4-a213-b01a732a7a1a" containerName="oc" Mar 19 21:08:00 crc 
kubenswrapper[4799]: I0319 21:08:00.172004 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="gather" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.172067 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da49e7e-9b5f-4fc9-942c-5cf5b7a9bbdd" containerName="copy" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.172102 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="5babc98b-61ae-4dc4-a213-b01a732a7a1a" containerName="oc" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.173355 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.177223 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.177734 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.187035 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.191618 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565908-mg7ws"] Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.339215 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj66b\" (UniqueName: \"kubernetes.io/projected/72726a12-de75-4af2-be10-0aece53735ae-kube-api-access-fj66b\") pod \"auto-csr-approver-29565908-mg7ws\" (UID: \"72726a12-de75-4af2-be10-0aece53735ae\") " pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.441575 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fj66b\" (UniqueName: \"kubernetes.io/projected/72726a12-de75-4af2-be10-0aece53735ae-kube-api-access-fj66b\") pod \"auto-csr-approver-29565908-mg7ws\" (UID: \"72726a12-de75-4af2-be10-0aece53735ae\") " pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.472869 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj66b\" (UniqueName: \"kubernetes.io/projected/72726a12-de75-4af2-be10-0aece53735ae-kube-api-access-fj66b\") pod \"auto-csr-approver-29565908-mg7ws\" (UID: \"72726a12-de75-4af2-be10-0aece53735ae\") " pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.505356 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:00 crc kubenswrapper[4799]: I0319 21:08:00.986301 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565908-mg7ws"] Mar 19 21:08:00 crc kubenswrapper[4799]: W0319 21:08:00.990918 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72726a12_de75_4af2_be10_0aece53735ae.slice/crio-2de58a92c9c90cc81896da74d719ee0de31f156d125ffeef9dcc5588ac04e319 WatchSource:0}: Error finding container 2de58a92c9c90cc81896da74d719ee0de31f156d125ffeef9dcc5588ac04e319: Status 404 returned error can't find the container with id 2de58a92c9c90cc81896da74d719ee0de31f156d125ffeef9dcc5588ac04e319 Mar 19 21:08:01 crc kubenswrapper[4799]: I0319 21:08:01.898609 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" event={"ID":"72726a12-de75-4af2-be10-0aece53735ae","Type":"ContainerStarted","Data":"2de58a92c9c90cc81896da74d719ee0de31f156d125ffeef9dcc5588ac04e319"} Mar 19 21:08:02 crc 
kubenswrapper[4799]: I0319 21:08:02.911396 4799 generic.go:334] "Generic (PLEG): container finished" podID="72726a12-de75-4af2-be10-0aece53735ae" containerID="34f5f7d8f558c4c1caef74b5c2de59a685314cd52232e096fa0e4251532464e8" exitCode=0 Mar 19 21:08:02 crc kubenswrapper[4799]: I0319 21:08:02.911499 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" event={"ID":"72726a12-de75-4af2-be10-0aece53735ae","Type":"ContainerDied","Data":"34f5f7d8f558c4c1caef74b5c2de59a685314cd52232e096fa0e4251532464e8"} Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.314939 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.434544 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj66b\" (UniqueName: \"kubernetes.io/projected/72726a12-de75-4af2-be10-0aece53735ae-kube-api-access-fj66b\") pod \"72726a12-de75-4af2-be10-0aece53735ae\" (UID: \"72726a12-de75-4af2-be10-0aece53735ae\") " Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.442524 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72726a12-de75-4af2-be10-0aece53735ae-kube-api-access-fj66b" (OuterVolumeSpecName: "kube-api-access-fj66b") pod "72726a12-de75-4af2-be10-0aece53735ae" (UID: "72726a12-de75-4af2-be10-0aece53735ae"). InnerVolumeSpecName "kube-api-access-fj66b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.538818 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj66b\" (UniqueName: \"kubernetes.io/projected/72726a12-de75-4af2-be10-0aece53735ae-kube-api-access-fj66b\") on node \"crc\" DevicePath \"\"" Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.933416 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" event={"ID":"72726a12-de75-4af2-be10-0aece53735ae","Type":"ContainerDied","Data":"2de58a92c9c90cc81896da74d719ee0de31f156d125ffeef9dcc5588ac04e319"} Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.933480 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2de58a92c9c90cc81896da74d719ee0de31f156d125ffeef9dcc5588ac04e319" Mar 19 21:08:04 crc kubenswrapper[4799]: I0319 21:08:04.933522 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565908-mg7ws" Mar 19 21:08:05 crc kubenswrapper[4799]: I0319 21:08:05.400430 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565902-qb8wl"] Mar 19 21:08:05 crc kubenswrapper[4799]: I0319 21:08:05.410620 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565902-qb8wl"] Mar 19 21:08:07 crc kubenswrapper[4799]: I0319 21:08:07.135602 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d5f387-63eb-4dce-b4c3-3ec1a012e0a7" path="/var/lib/kubelet/pods/13d5f387-63eb-4dce-b4c3-3ec1a012e0a7/volumes" Mar 19 21:08:22 crc kubenswrapper[4799]: I0319 21:08:22.271260 4799 scope.go:117] "RemoveContainer" containerID="3b20e0a7a7a8f6d54a23c6e836c4afc99b9bec34673fb2941fdb159bed185d00" Mar 19 21:08:22 crc kubenswrapper[4799]: I0319 21:08:22.307419 4799 scope.go:117] "RemoveContainer" 
containerID="7d6584362e2a0c54e63cc1c2f4237f56dadcdad7fec794dbc4163db6b3050511" Mar 19 21:08:22 crc kubenswrapper[4799]: I0319 21:08:22.374784 4799 scope.go:117] "RemoveContainer" containerID="4765a86c083715c0157d1fd86ab50831f58c139d05a62e294e09ba37ef00d353" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.314483 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8kxv/must-gather-xcmfm"] Mar 19 21:09:30 crc kubenswrapper[4799]: E0319 21:09:30.315363 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72726a12-de75-4af2-be10-0aece53735ae" containerName="oc" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.315374 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="72726a12-de75-4af2-be10-0aece53735ae" containerName="oc" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.315563 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="72726a12-de75-4af2-be10-0aece53735ae" containerName="oc" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.316570 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.319260 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b8kxv"/"openshift-service-ca.crt" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.319525 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b8kxv"/"kube-root-ca.crt" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.319536 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b8kxv"/"default-dockercfg-n9xm5" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.322370 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8kxv/must-gather-xcmfm"] Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.362948 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ac7839c-b697-426a-b4e6-50559cca8e79-must-gather-output\") pod \"must-gather-xcmfm\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.363005 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4htt\" (UniqueName: \"kubernetes.io/projected/2ac7839c-b697-426a-b4e6-50559cca8e79-kube-api-access-q4htt\") pod \"must-gather-xcmfm\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.464635 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ac7839c-b697-426a-b4e6-50559cca8e79-must-gather-output\") pod \"must-gather-xcmfm\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " 
pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.464710 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4htt\" (UniqueName: \"kubernetes.io/projected/2ac7839c-b697-426a-b4e6-50559cca8e79-kube-api-access-q4htt\") pod \"must-gather-xcmfm\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.465082 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ac7839c-b697-426a-b4e6-50559cca8e79-must-gather-output\") pod \"must-gather-xcmfm\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.481888 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4htt\" (UniqueName: \"kubernetes.io/projected/2ac7839c-b697-426a-b4e6-50559cca8e79-kube-api-access-q4htt\") pod \"must-gather-xcmfm\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:30 crc kubenswrapper[4799]: I0319 21:09:30.635445 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:09:31 crc kubenswrapper[4799]: I0319 21:09:31.156848 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8kxv/must-gather-xcmfm"] Mar 19 21:09:32 crc kubenswrapper[4799]: I0319 21:09:32.026143 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" event={"ID":"2ac7839c-b697-426a-b4e6-50559cca8e79","Type":"ContainerStarted","Data":"ae87fb236e7291e3573e283542834c8c207af2be87c5357f6260ed83b1831af3"} Mar 19 21:09:32 crc kubenswrapper[4799]: I0319 21:09:32.026736 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" event={"ID":"2ac7839c-b697-426a-b4e6-50559cca8e79","Type":"ContainerStarted","Data":"17ce42f55b19e5a443b00a691e5a37bf87b31daea99677cbdf8502d6721ae952"} Mar 19 21:09:32 crc kubenswrapper[4799]: I0319 21:09:32.026749 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" event={"ID":"2ac7839c-b697-426a-b4e6-50559cca8e79","Type":"ContainerStarted","Data":"17639cab0f401fea4f54b2476815ac6dc01ac6b6169f9f01e4e4a79b741dab54"} Mar 19 21:09:32 crc kubenswrapper[4799]: I0319 21:09:32.056534 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" podStartSLOduration=2.056503873 podStartE2EDuration="2.056503873s" podCreationTimestamp="2026-03-19 21:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 21:09:32.042013353 +0000 UTC m=+3849.647966425" watchObservedRunningTime="2026-03-19 21:09:32.056503873 +0000 UTC m=+3849.662456955" Mar 19 21:09:34 crc kubenswrapper[4799]: E0319 21:09:34.405999 4799 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.107:51918->38.102.83.107:33291: write tcp 
38.102.83.107:51918->38.102.83.107:33291: write: broken pipe Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.109833 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-djkt6"] Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.111407 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.176134 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64ce3e54-f619-41b8-b639-735d587ab61e-host\") pod \"crc-debug-djkt6\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.176217 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69x8b\" (UniqueName: \"kubernetes.io/projected/64ce3e54-f619-41b8-b639-735d587ab61e-kube-api-access-69x8b\") pod \"crc-debug-djkt6\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.278485 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64ce3e54-f619-41b8-b639-735d587ab61e-host\") pod \"crc-debug-djkt6\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.278607 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64ce3e54-f619-41b8-b639-735d587ab61e-host\") pod \"crc-debug-djkt6\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.278877 4799 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69x8b\" (UniqueName: \"kubernetes.io/projected/64ce3e54-f619-41b8-b639-735d587ab61e-kube-api-access-69x8b\") pod \"crc-debug-djkt6\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.299758 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69x8b\" (UniqueName: \"kubernetes.io/projected/64ce3e54-f619-41b8-b639-735d587ab61e-kube-api-access-69x8b\") pod \"crc-debug-djkt6\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:35 crc kubenswrapper[4799]: I0319 21:09:35.439078 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:09:36 crc kubenswrapper[4799]: I0319 21:09:36.075943 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" event={"ID":"64ce3e54-f619-41b8-b639-735d587ab61e","Type":"ContainerStarted","Data":"618be630872dfbf30231362d3d16b6dc89594f38c3bb4104e058e28e59acd3f9"} Mar 19 21:09:36 crc kubenswrapper[4799]: I0319 21:09:36.076633 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" event={"ID":"64ce3e54-f619-41b8-b639-735d587ab61e","Type":"ContainerStarted","Data":"1eb58568def7c958a97f39d33d846fe9efc2caa01dbf896b51421195e1c510b9"} Mar 19 21:09:36 crc kubenswrapper[4799]: I0319 21:09:36.100842 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" podStartSLOduration=1.100822727 podStartE2EDuration="1.100822727s" podCreationTimestamp="2026-03-19 21:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
21:09:36.094468598 +0000 UTC m=+3853.700421690" watchObservedRunningTime="2026-03-19 21:09:36.100822727 +0000 UTC m=+3853.706775829" Mar 19 21:09:58 crc kubenswrapper[4799]: I0319 21:09:58.755827 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 21:09:58 crc kubenswrapper[4799]: I0319 21:09:58.756331 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.134542 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565910-57gjx"] Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.136152 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.138080 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.138460 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.138982 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.149696 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565910-57gjx"] Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.243372 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4jf\" (UniqueName: \"kubernetes.io/projected/e75cbce3-390b-43f5-9a1f-00fae47ac87a-kube-api-access-gs4jf\") pod \"auto-csr-approver-29565910-57gjx\" (UID: \"e75cbce3-390b-43f5-9a1f-00fae47ac87a\") " pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.345560 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4jf\" (UniqueName: \"kubernetes.io/projected/e75cbce3-390b-43f5-9a1f-00fae47ac87a-kube-api-access-gs4jf\") pod \"auto-csr-approver-29565910-57gjx\" (UID: \"e75cbce3-390b-43f5-9a1f-00fae47ac87a\") " pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.366100 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4jf\" (UniqueName: \"kubernetes.io/projected/e75cbce3-390b-43f5-9a1f-00fae47ac87a-kube-api-access-gs4jf\") pod \"auto-csr-approver-29565910-57gjx\" (UID: \"e75cbce3-390b-43f5-9a1f-00fae47ac87a\") " 
pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.495594 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:00 crc kubenswrapper[4799]: I0319 21:10:00.984419 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565910-57gjx"] Mar 19 21:10:01 crc kubenswrapper[4799]: I0319 21:10:01.287598 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565910-57gjx" event={"ID":"e75cbce3-390b-43f5-9a1f-00fae47ac87a","Type":"ContainerStarted","Data":"4b739b798801dc9b9ae417dbc4e1b928d95fad7ea76729bc1255f21e325d73ec"} Mar 19 21:10:03 crc kubenswrapper[4799]: I0319 21:10:03.308547 4799 generic.go:334] "Generic (PLEG): container finished" podID="e75cbce3-390b-43f5-9a1f-00fae47ac87a" containerID="79e64b595c0d7e76ec6508db0176eaaabd9a1256f1ca3fa82e9a975db0c85d18" exitCode=0 Mar 19 21:10:03 crc kubenswrapper[4799]: I0319 21:10:03.308772 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565910-57gjx" event={"ID":"e75cbce3-390b-43f5-9a1f-00fae47ac87a","Type":"ContainerDied","Data":"79e64b595c0d7e76ec6508db0176eaaabd9a1256f1ca3fa82e9a975db0c85d18"} Mar 19 21:10:04 crc kubenswrapper[4799]: I0319 21:10:04.685266 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:04 crc kubenswrapper[4799]: I0319 21:10:04.722944 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs4jf\" (UniqueName: \"kubernetes.io/projected/e75cbce3-390b-43f5-9a1f-00fae47ac87a-kube-api-access-gs4jf\") pod \"e75cbce3-390b-43f5-9a1f-00fae47ac87a\" (UID: \"e75cbce3-390b-43f5-9a1f-00fae47ac87a\") " Mar 19 21:10:04 crc kubenswrapper[4799]: I0319 21:10:04.744701 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75cbce3-390b-43f5-9a1f-00fae47ac87a-kube-api-access-gs4jf" (OuterVolumeSpecName: "kube-api-access-gs4jf") pod "e75cbce3-390b-43f5-9a1f-00fae47ac87a" (UID: "e75cbce3-390b-43f5-9a1f-00fae47ac87a"). InnerVolumeSpecName "kube-api-access-gs4jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:10:04 crc kubenswrapper[4799]: I0319 21:10:04.824700 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs4jf\" (UniqueName: \"kubernetes.io/projected/e75cbce3-390b-43f5-9a1f-00fae47ac87a-kube-api-access-gs4jf\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:05 crc kubenswrapper[4799]: I0319 21:10:05.328455 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565910-57gjx" event={"ID":"e75cbce3-390b-43f5-9a1f-00fae47ac87a","Type":"ContainerDied","Data":"4b739b798801dc9b9ae417dbc4e1b928d95fad7ea76729bc1255f21e325d73ec"} Mar 19 21:10:05 crc kubenswrapper[4799]: I0319 21:10:05.328487 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565910-57gjx" Mar 19 21:10:05 crc kubenswrapper[4799]: I0319 21:10:05.328494 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b739b798801dc9b9ae417dbc4e1b928d95fad7ea76729bc1255f21e325d73ec" Mar 19 21:10:05 crc kubenswrapper[4799]: I0319 21:10:05.754619 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565904-gzkx5"] Mar 19 21:10:05 crc kubenswrapper[4799]: I0319 21:10:05.762687 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565904-gzkx5"] Mar 19 21:10:07 crc kubenswrapper[4799]: I0319 21:10:07.126302 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83822866-9772-43df-8f03-7093c5f1dba3" path="/var/lib/kubelet/pods/83822866-9772-43df-8f03-7093c5f1dba3/volumes" Mar 19 21:10:08 crc kubenswrapper[4799]: I0319 21:10:08.353454 4799 generic.go:334] "Generic (PLEG): container finished" podID="64ce3e54-f619-41b8-b639-735d587ab61e" containerID="618be630872dfbf30231362d3d16b6dc89594f38c3bb4104e058e28e59acd3f9" exitCode=0 Mar 19 21:10:08 crc kubenswrapper[4799]: I0319 21:10:08.353658 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" event={"ID":"64ce3e54-f619-41b8-b639-735d587ab61e","Type":"ContainerDied","Data":"618be630872dfbf30231362d3d16b6dc89594f38c3bb4104e058e28e59acd3f9"} Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.463434 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.494069 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-djkt6"] Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.501927 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-djkt6"] Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.518587 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69x8b\" (UniqueName: \"kubernetes.io/projected/64ce3e54-f619-41b8-b639-735d587ab61e-kube-api-access-69x8b\") pod \"64ce3e54-f619-41b8-b639-735d587ab61e\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.518793 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64ce3e54-f619-41b8-b639-735d587ab61e-host\") pod \"64ce3e54-f619-41b8-b639-735d587ab61e\" (UID: \"64ce3e54-f619-41b8-b639-735d587ab61e\") " Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.518914 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64ce3e54-f619-41b8-b639-735d587ab61e-host" (OuterVolumeSpecName: "host") pod "64ce3e54-f619-41b8-b639-735d587ab61e" (UID: "64ce3e54-f619-41b8-b639-735d587ab61e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.519213 4799 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64ce3e54-f619-41b8-b639-735d587ab61e-host\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.526706 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ce3e54-f619-41b8-b639-735d587ab61e-kube-api-access-69x8b" (OuterVolumeSpecName: "kube-api-access-69x8b") pod "64ce3e54-f619-41b8-b639-735d587ab61e" (UID: "64ce3e54-f619-41b8-b639-735d587ab61e"). InnerVolumeSpecName "kube-api-access-69x8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:10:09 crc kubenswrapper[4799]: I0319 21:10:09.620518 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69x8b\" (UniqueName: \"kubernetes.io/projected/64ce3e54-f619-41b8-b639-735d587ab61e-kube-api-access-69x8b\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.373201 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb58568def7c958a97f39d33d846fe9efc2caa01dbf896b51421195e1c510b9" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.373300 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-djkt6" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.712030 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-tf4hd"] Mar 19 21:10:10 crc kubenswrapper[4799]: E0319 21:10:10.712749 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ce3e54-f619-41b8-b639-735d587ab61e" containerName="container-00" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.712766 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ce3e54-f619-41b8-b639-735d587ab61e" containerName="container-00" Mar 19 21:10:10 crc kubenswrapper[4799]: E0319 21:10:10.712799 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75cbce3-390b-43f5-9a1f-00fae47ac87a" containerName="oc" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.712811 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75cbce3-390b-43f5-9a1f-00fae47ac87a" containerName="oc" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.713079 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75cbce3-390b-43f5-9a1f-00fae47ac87a" containerName="oc" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.713116 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ce3e54-f619-41b8-b639-735d587ab61e" containerName="container-00" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.713868 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.845742 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v78ck\" (UniqueName: \"kubernetes.io/projected/8540f773-274c-4ae5-9890-e90c582bd7a0-kube-api-access-v78ck\") pod \"crc-debug-tf4hd\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.845824 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8540f773-274c-4ae5-9890-e90c582bd7a0-host\") pod \"crc-debug-tf4hd\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.947818 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v78ck\" (UniqueName: \"kubernetes.io/projected/8540f773-274c-4ae5-9890-e90c582bd7a0-kube-api-access-v78ck\") pod \"crc-debug-tf4hd\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.947920 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8540f773-274c-4ae5-9890-e90c582bd7a0-host\") pod \"crc-debug-tf4hd\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:10 crc kubenswrapper[4799]: I0319 21:10:10.948065 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8540f773-274c-4ae5-9890-e90c582bd7a0-host\") pod \"crc-debug-tf4hd\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:10 crc 
kubenswrapper[4799]: I0319 21:10:10.967933 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v78ck\" (UniqueName: \"kubernetes.io/projected/8540f773-274c-4ae5-9890-e90c582bd7a0-kube-api-access-v78ck\") pod \"crc-debug-tf4hd\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:11 crc kubenswrapper[4799]: I0319 21:10:11.035677 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:11 crc kubenswrapper[4799]: I0319 21:10:11.130827 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ce3e54-f619-41b8-b639-735d587ab61e" path="/var/lib/kubelet/pods/64ce3e54-f619-41b8-b639-735d587ab61e/volumes" Mar 19 21:10:11 crc kubenswrapper[4799]: I0319 21:10:11.383028 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" event={"ID":"8540f773-274c-4ae5-9890-e90c582bd7a0","Type":"ContainerStarted","Data":"1966c12b34a526de3afd55eec54f30ca10d2016bf32ade30143207199e6e38e6"} Mar 19 21:10:11 crc kubenswrapper[4799]: I0319 21:10:11.383425 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" event={"ID":"8540f773-274c-4ae5-9890-e90c582bd7a0","Type":"ContainerStarted","Data":"ea1fa35b07003e56c9a4655a405ae58c8c95d903c2587095a307d2ec35a4324e"} Mar 19 21:10:11 crc kubenswrapper[4799]: I0319 21:10:11.401085 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" podStartSLOduration=1.401066563 podStartE2EDuration="1.401066563s" podCreationTimestamp="2026-03-19 21:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 21:10:11.396100083 +0000 UTC m=+3889.002053175" watchObservedRunningTime="2026-03-19 21:10:11.401066563 +0000 
UTC m=+3889.007019635" Mar 19 21:10:12 crc kubenswrapper[4799]: I0319 21:10:12.395975 4799 generic.go:334] "Generic (PLEG): container finished" podID="8540f773-274c-4ae5-9890-e90c582bd7a0" containerID="1966c12b34a526de3afd55eec54f30ca10d2016bf32ade30143207199e6e38e6" exitCode=0 Mar 19 21:10:12 crc kubenswrapper[4799]: I0319 21:10:12.396370 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" event={"ID":"8540f773-274c-4ae5-9890-e90c582bd7a0","Type":"ContainerDied","Data":"1966c12b34a526de3afd55eec54f30ca10d2016bf32ade30143207199e6e38e6"} Mar 19 21:10:13 crc kubenswrapper[4799]: I0319 21:10:13.894944 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:13 crc kubenswrapper[4799]: I0319 21:10:13.932360 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-tf4hd"] Mar 19 21:10:13 crc kubenswrapper[4799]: I0319 21:10:13.950458 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-tf4hd"] Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.008819 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v78ck\" (UniqueName: \"kubernetes.io/projected/8540f773-274c-4ae5-9890-e90c582bd7a0-kube-api-access-v78ck\") pod \"8540f773-274c-4ae5-9890-e90c582bd7a0\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.008882 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8540f773-274c-4ae5-9890-e90c582bd7a0-host\") pod \"8540f773-274c-4ae5-9890-e90c582bd7a0\" (UID: \"8540f773-274c-4ae5-9890-e90c582bd7a0\") " Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.009281 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8540f773-274c-4ae5-9890-e90c582bd7a0-host" (OuterVolumeSpecName: "host") pod "8540f773-274c-4ae5-9890-e90c582bd7a0" (UID: "8540f773-274c-4ae5-9890-e90c582bd7a0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.009649 4799 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8540f773-274c-4ae5-9890-e90c582bd7a0-host\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.029820 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8540f773-274c-4ae5-9890-e90c582bd7a0-kube-api-access-v78ck" (OuterVolumeSpecName: "kube-api-access-v78ck") pod "8540f773-274c-4ae5-9890-e90c582bd7a0" (UID: "8540f773-274c-4ae5-9890-e90c582bd7a0"). InnerVolumeSpecName "kube-api-access-v78ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.111710 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v78ck\" (UniqueName: \"kubernetes.io/projected/8540f773-274c-4ae5-9890-e90c582bd7a0-kube-api-access-v78ck\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.415176 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea1fa35b07003e56c9a4655a405ae58c8c95d903c2587095a307d2ec35a4324e" Mar 19 21:10:14 crc kubenswrapper[4799]: I0319 21:10:14.415233 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-tf4hd" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.127938 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8540f773-274c-4ae5-9890-e90c582bd7a0" path="/var/lib/kubelet/pods/8540f773-274c-4ae5-9890-e90c582bd7a0/volumes" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.286327 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-wd94m"] Mar 19 21:10:15 crc kubenswrapper[4799]: E0319 21:10:15.286750 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8540f773-274c-4ae5-9890-e90c582bd7a0" containerName="container-00" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.286766 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="8540f773-274c-4ae5-9890-e90c582bd7a0" containerName="container-00" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.286984 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="8540f773-274c-4ae5-9890-e90c582bd7a0" containerName="container-00" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.287549 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.333171 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47e084bf-4096-4d33-90f7-2e8c8d02c223-host\") pod \"crc-debug-wd94m\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.333403 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mtw\" (UniqueName: \"kubernetes.io/projected/47e084bf-4096-4d33-90f7-2e8c8d02c223-kube-api-access-b4mtw\") pod \"crc-debug-wd94m\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.435160 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mtw\" (UniqueName: \"kubernetes.io/projected/47e084bf-4096-4d33-90f7-2e8c8d02c223-kube-api-access-b4mtw\") pod \"crc-debug-wd94m\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.435593 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47e084bf-4096-4d33-90f7-2e8c8d02c223-host\") pod \"crc-debug-wd94m\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.435736 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47e084bf-4096-4d33-90f7-2e8c8d02c223-host\") pod \"crc-debug-wd94m\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc 
kubenswrapper[4799]: I0319 21:10:15.454186 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mtw\" (UniqueName: \"kubernetes.io/projected/47e084bf-4096-4d33-90f7-2e8c8d02c223-kube-api-access-b4mtw\") pod \"crc-debug-wd94m\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: I0319 21:10:15.606570 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:15 crc kubenswrapper[4799]: W0319 21:10:15.635519 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47e084bf_4096_4d33_90f7_2e8c8d02c223.slice/crio-803b91c8de44d0d95a2d52ed20798bbc0a90efd2b575ae5eff65a83a00444b3f WatchSource:0}: Error finding container 803b91c8de44d0d95a2d52ed20798bbc0a90efd2b575ae5eff65a83a00444b3f: Status 404 returned error can't find the container with id 803b91c8de44d0d95a2d52ed20798bbc0a90efd2b575ae5eff65a83a00444b3f Mar 19 21:10:16 crc kubenswrapper[4799]: I0319 21:10:16.432220 4799 generic.go:334] "Generic (PLEG): container finished" podID="47e084bf-4096-4d33-90f7-2e8c8d02c223" containerID="0e7cd0d27ede34f36f49867b8ad7c4aef6597f1b05086430cf0dc17ba8053f81" exitCode=0 Mar 19 21:10:16 crc kubenswrapper[4799]: I0319 21:10:16.432310 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-wd94m" event={"ID":"47e084bf-4096-4d33-90f7-2e8c8d02c223","Type":"ContainerDied","Data":"0e7cd0d27ede34f36f49867b8ad7c4aef6597f1b05086430cf0dc17ba8053f81"} Mar 19 21:10:16 crc kubenswrapper[4799]: I0319 21:10:16.432559 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/crc-debug-wd94m" event={"ID":"47e084bf-4096-4d33-90f7-2e8c8d02c223","Type":"ContainerStarted","Data":"803b91c8de44d0d95a2d52ed20798bbc0a90efd2b575ae5eff65a83a00444b3f"} Mar 19 
21:10:16 crc kubenswrapper[4799]: I0319 21:10:16.468113 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-wd94m"] Mar 19 21:10:16 crc kubenswrapper[4799]: I0319 21:10:16.474956 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8kxv/crc-debug-wd94m"] Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.537895 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.669649 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47e084bf-4096-4d33-90f7-2e8c8d02c223-host\") pod \"47e084bf-4096-4d33-90f7-2e8c8d02c223\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.669756 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47e084bf-4096-4d33-90f7-2e8c8d02c223-host" (OuterVolumeSpecName: "host") pod "47e084bf-4096-4d33-90f7-2e8c8d02c223" (UID: "47e084bf-4096-4d33-90f7-2e8c8d02c223"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.669838 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4mtw\" (UniqueName: \"kubernetes.io/projected/47e084bf-4096-4d33-90f7-2e8c8d02c223-kube-api-access-b4mtw\") pod \"47e084bf-4096-4d33-90f7-2e8c8d02c223\" (UID: \"47e084bf-4096-4d33-90f7-2e8c8d02c223\") " Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.670275 4799 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/47e084bf-4096-4d33-90f7-2e8c8d02c223-host\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.676575 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e084bf-4096-4d33-90f7-2e8c8d02c223-kube-api-access-b4mtw" (OuterVolumeSpecName: "kube-api-access-b4mtw") pod "47e084bf-4096-4d33-90f7-2e8c8d02c223" (UID: "47e084bf-4096-4d33-90f7-2e8c8d02c223"). InnerVolumeSpecName "kube-api-access-b4mtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:10:17 crc kubenswrapper[4799]: I0319 21:10:17.772195 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4mtw\" (UniqueName: \"kubernetes.io/projected/47e084bf-4096-4d33-90f7-2e8c8d02c223-kube-api-access-b4mtw\") on node \"crc\" DevicePath \"\"" Mar 19 21:10:18 crc kubenswrapper[4799]: I0319 21:10:18.456738 4799 scope.go:117] "RemoveContainer" containerID="0e7cd0d27ede34f36f49867b8ad7c4aef6597f1b05086430cf0dc17ba8053f81" Mar 19 21:10:18 crc kubenswrapper[4799]: I0319 21:10:18.456823 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/crc-debug-wd94m" Mar 19 21:10:19 crc kubenswrapper[4799]: I0319 21:10:19.126108 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e084bf-4096-4d33-90f7-2e8c8d02c223" path="/var/lib/kubelet/pods/47e084bf-4096-4d33-90f7-2e8c8d02c223/volumes" Mar 19 21:10:22 crc kubenswrapper[4799]: I0319 21:10:22.577070 4799 scope.go:117] "RemoveContainer" containerID="839f5503683b265796d728638aa312c27694ff772c60ad0d4b4f432846d58d0c" Mar 19 21:10:28 crc kubenswrapper[4799]: I0319 21:10:28.756413 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 21:10:28 crc kubenswrapper[4799]: I0319 21:10:28.757103 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 21:10:51 crc kubenswrapper[4799]: I0319 21:10:51.894572 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-868b778b64-pzgfd_c6e6e053-3361-4d22-9ef7-fd7e96b77cf4/barbican-api/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.043218 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-868b778b64-pzgfd_c6e6e053-3361-4d22-9ef7-fd7e96b77cf4/barbican-api-log/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.085641 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-776dcdd75d-ljvsd_4b710194-7925-42ab-b779-be3f32094307/barbican-keystone-listener/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: 
I0319 21:10:52.132798 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-776dcdd75d-ljvsd_4b710194-7925-42ab-b779-be3f32094307/barbican-keystone-listener-log/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.252192 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-784d54cf6f-wm6ts_c6596c03-a397-4f22-b511-86e89635a92a/barbican-worker/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.271677 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-784d54cf6f-wm6ts_c6596c03-a397-4f22-b511-86e89635a92a/barbican-worker-log/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.447743 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xqgb9_8befdbab-e306-4827-98a7-f042c02380ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.501262 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/ceilometer-central-agent/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.534207 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/ceilometer-notification-agent/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.621349 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/proxy-httpd/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.646316 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_509e207f-9e25-4446-a56c-871da702f099/sg-core/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.738573 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_01d335a7-09e5-4073-bf70-5ac03807ff12/cinder-api/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.794870 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_01d335a7-09e5-4073-bf70-5ac03807ff12/cinder-api-log/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.967162 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6c780df-41c8-47d8-af6d-3dcefb770b8d/probe/0.log" Mar 19 21:10:52 crc kubenswrapper[4799]: I0319 21:10:52.970701 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6c780df-41c8-47d8-af6d-3dcefb770b8d/cinder-scheduler/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.074991 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-zc2hw_8886a9f8-8f15-43f4-a721-3a487d2ff6f7/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.186669 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fcfgh_e06b9a2d-57e1-4c4d-a08e-d0c2c343130d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.314757 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79db78f56f-q2stc_9ae6606b-efd1-4bf8-a13f-c14d96bbaa99/init/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.490600 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79db78f56f-q2stc_9ae6606b-efd1-4bf8-a13f-c14d96bbaa99/init/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.540730 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79db78f56f-q2stc_9ae6606b-efd1-4bf8-a13f-c14d96bbaa99/dnsmasq-dns/0.log" Mar 19 21:10:53 crc 
kubenswrapper[4799]: I0319 21:10:53.562450 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vgxll_8e141b61-4ffc-407f-a554-58f8176b1b18/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.761453 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ec6464f3-4c36-4387-8127-56a300a1d79c/glance-httpd/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.784674 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ec6464f3-4c36-4387-8127-56a300a1d79c/glance-log/0.log" Mar 19 21:10:53 crc kubenswrapper[4799]: I0319 21:10:53.950222 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a4455494-ef4e-4f95-87d6-cd495059bb9a/glance-httpd/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.018167 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a4455494-ef4e-4f95-87d6-cd495059bb9a/glance-log/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.170164 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56454c8868-kxl79_d9b7bec9-2633-410d-be4e-c65c9a903a38/horizon/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.327089 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nc852_9e20098d-7fb1-4aba-b460-1440751e6dc2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.520754 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56454c8868-kxl79_d9b7bec9-2633-410d-be4e-c65c9a903a38/horizon-log/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.572719 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4wbn8_416c30d9-5442-418f-a668-fcca8c4804a2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.742028 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76f77cb758-wwjbp_0eea0f4a-eab7-4dae-844b-f614654cd6d4/keystone-api/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.751493 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565901-g94sr_88b1db62-9d2d-4d6b-8a37-d45f1feb2319/keystone-cron/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.893906 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_50808de7-9788-451a-8910-6be8f217ae09/kube-state-metrics/0.log" Mar 19 21:10:54 crc kubenswrapper[4799]: I0319 21:10:54.998584 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-g5tb6_880fa93c-771b-4896-9ab3-cb7d0a9ab3e0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:55 crc kubenswrapper[4799]: I0319 21:10:55.359992 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77768f5c85-6lgxw_f2d71d6b-73c6-4edf-8ef2-5295c628603c/neutron-api/0.log" Mar 19 21:10:55 crc kubenswrapper[4799]: I0319 21:10:55.383828 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77768f5c85-6lgxw_f2d71d6b-73c6-4edf-8ef2-5295c628603c/neutron-httpd/0.log" Mar 19 21:10:55 crc kubenswrapper[4799]: I0319 21:10:55.571007 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dfck6_28b384cd-034d-407f-a7f6-0b1b0ddffb4f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.156871 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_d8aa3e34-ca13-45cd-960f-2973a80c80b8/nova-cell0-conductor-conductor/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.181317 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f822edd3-bebb-4a8a-9755-60419527dbde/nova-api-log/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.542306 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_df1146cf-6649-441e-b17f-bfbebbdc3439/nova-cell1-conductor-conductor/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.556783 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e45143c6-a88e-40c6-a6fc-5452da3be735/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.635849 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f822edd3-bebb-4a8a-9755-60419527dbde/nova-api-api/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.782672 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5x7fx_0317cfee-27aa-4ba1-9c6a-cf2b368c811b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:56 crc kubenswrapper[4799]: I0319 21:10:56.935370 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a1b0148b-b48b-44b2-9fee-1fd4389fbf77/nova-metadata-log/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.231130 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_2add32a8-30b6-4000-bdd7-c96cda2bb599/nova-scheduler-scheduler/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.254683 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ad66907-e766-4e25-9e0c-03e2a0a803e6/mysql-bootstrap/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 
21:10:57.314355 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a1b0148b-b48b-44b2-9fee-1fd4389fbf77/nova-metadata-metadata/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.447354 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ad66907-e766-4e25-9e0c-03e2a0a803e6/mysql-bootstrap/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.525547 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ad66907-e766-4e25-9e0c-03e2a0a803e6/galera/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.538636 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5b6d87e-0486-4c0c-9578-514626ca7579/mysql-bootstrap/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.731752 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5b6d87e-0486-4c0c-9578-514626ca7579/mysql-bootstrap/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.737799 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5b6d87e-0486-4c0c-9578-514626ca7579/galera/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.771736 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_575c8839-3cdb-4137-967a-3544c626113f/openstackclient/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.962986 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-q7s8k_ada5da2f-892a-4d4b-a18c-d641456e9124/openstack-network-exporter/0.log" Mar 19 21:10:57 crc kubenswrapper[4799]: I0319 21:10:57.967046 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-h89k2_cef0dc98-f20b-4f5a-b0f7-7d121e2b5fd8/ovn-controller/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.170147 4799 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovsdb-server-init/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.350246 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovs-vswitchd/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.393435 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovsdb-server-init/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.403515 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dcj2n_279eab1e-b756-4a2c-be19-2b16d87d645c/ovsdb-server/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.591322 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c9nbt_b526ddd9-685d-42a9-8598-d5dd7710942a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.605503 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_df45d4ef-7350-42d7-a2d7-cede9b13ff55/openstack-network-exporter/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.755315 4799 patch_prober.go:28] interesting pod/machine-config-daemon-mv84p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.755377 4799 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.755437 4799 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.756415 4799 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8"} pod="openshift-machine-config-operator/machine-config-daemon-mv84p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.756477 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerName="machine-config-daemon" containerID="cri-o://f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" gracePeriod=600 Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.773562 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_df45d4ef-7350-42d7-a2d7-cede9b13ff55/ovn-northd/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.857902 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a3d93760-46c9-46e0-aff9-38ad08bad16b/openstack-network-exporter/0.log" Mar 19 21:10:58 crc kubenswrapper[4799]: E0319 21:10:58.875956 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:10:58 crc kubenswrapper[4799]: I0319 21:10:58.887632 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a3d93760-46c9-46e0-aff9-38ad08bad16b/ovsdbserver-nb/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.052859 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_379ddd3c-6b22-4ccd-90a0-8c1cce4a572b/openstack-network-exporter/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.089820 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_379ddd3c-6b22-4ccd-90a0-8c1cce4a572b/ovsdbserver-sb/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.276892 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-686489978d-5lwnf_55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3/placement-api/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.360189 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-686489978d-5lwnf_55bf4b6f-6b3e-496d-8e5b-2be9d9fbcbc3/placement-log/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.433702 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1b1c5d7a-7501-4c34-9823-c996a2413399/setup-container/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.554428 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1b1c5d7a-7501-4c34-9823-c996a2413399/setup-container/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.619905 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1b1c5d7a-7501-4c34-9823-c996a2413399/rabbitmq/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.732627 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_3117828b-97c2-41b6-a48d-cf7154e2bb71/setup-container/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.811827 4799 generic.go:334] "Generic (PLEG): container finished" podID="cf986000-80c1-4cf1-8648-d2f7ee370e88" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" exitCode=0 Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.811875 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerDied","Data":"f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8"} Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.811943 4799 scope.go:117] "RemoveContainer" containerID="2b31612692af021bb5f91c7f48d51540f477275a528f51f53a0921d879975895" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.812571 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:10:59 crc kubenswrapper[4799]: E0319 21:10:59.813036 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.879496 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3117828b-97c2-41b6-a48d-cf7154e2bb71/rabbitmq/0.log" Mar 19 21:10:59 crc kubenswrapper[4799]: I0319 21:10:59.918462 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3117828b-97c2-41b6-a48d-cf7154e2bb71/setup-container/0.log" Mar 19 21:10:59 crc 
kubenswrapper[4799]: I0319 21:10:59.936011 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-klmgh_d20cbc69-15fd-45a3-95f9-d29078eb55c7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.091056 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ngr7f_27e3e050-d2b1-4bc2-b93c-4258f8f4a86d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.213119 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5fmck_6e6ab146-318d-4a9d-860b-8c1d5c12fab9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.296649 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-cxss6_2c78b6c2-a079-427d-9ebe-f8250777e6bd/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.392889 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2jrjr_c2e0b37a-955c-4332-bae2-6a7ffd2712f4/ssh-known-hosts-edpm-deployment/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.685745 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75d564c56c-vzn6z_ed03c750-cae7-4181-8451-c88d57969c01/proxy-server/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.707175 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75d564c56c-vzn6z_ed03c750-cae7-4181-8451-c88d57969c01/proxy-httpd/0.log" Mar 19 21:11:00 crc kubenswrapper[4799]: I0319 21:11:00.816011 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-ring-rebalance-stws7_1f8bd39c-7709-4713-b7f6-9713873dae5b/swift-ring-rebalance/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.129092 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-auditor/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.139879 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-reaper/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.224056 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-replicator/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.306103 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/account-server/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.337150 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-auditor/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.378118 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-replicator/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.477082 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-server/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.526916 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/container-updater/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.571622 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-auditor/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.649932 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-expirer/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.727222 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-replicator/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.791927 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-server/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.839960 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/object-updater/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.884222 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/rsync/0.log" Mar 19 21:11:01 crc kubenswrapper[4799]: I0319 21:11:01.956425 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa0c0465-9e46-41e7-88b3-07a6da9cd6c7/swift-recon-cron/0.log" Mar 19 21:11:02 crc kubenswrapper[4799]: I0319 21:11:02.105736 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-j27sk_d77bbb41-e2be-4c81-b266-92ca0b6e8b44/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:11:02 crc kubenswrapper[4799]: I0319 21:11:02.215856 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66572ad6-a9d4-4dc7-ae3e-61a1d67d928a/tempest-tests-tempest-tests-runner/0.log" Mar 19 21:11:02 crc kubenswrapper[4799]: I0319 21:11:02.338606 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e3586638-0807-4ea9-9027-0b953c5ea3cb/test-operator-logs-container/0.log" Mar 19 21:11:02 crc kubenswrapper[4799]: I0319 21:11:02.403643 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gjjlz_4bb02004-3780-40e2-9e05-93d1791c0c16/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 21:11:10 crc kubenswrapper[4799]: I0319 21:11:10.438091 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d881c32e-3c0c-415a-aa56-6e70a316b015/memcached/0.log" Mar 19 21:11:15 crc kubenswrapper[4799]: I0319 21:11:15.116713 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:11:15 crc kubenswrapper[4799]: E0319 21:11:15.117438 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:11:28 crc kubenswrapper[4799]: I0319 21:11:28.116990 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:11:28 crc kubenswrapper[4799]: E0319 21:11:28.117839 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 
19 21:11:29 crc kubenswrapper[4799]: I0319 21:11:29.852106 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/util/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.061969 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/pull/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.066670 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/pull/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.082561 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/util/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.226573 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/extract/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.234978 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/pull/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.237463 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c78hwq_21ee74fa-1ec7-4b43-9ca7-5b7535f7a8c3/util/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.477020 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-jphcf_f5a6c547-0da9-4313-817a-9562fa9cb775/manager/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.708593 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-6j7gd_8f841081-d1d8-464a-ae77-af76f0a109ea/manager/0.log" Mar 19 21:11:30 crc kubenswrapper[4799]: I0319 21:11:30.972453 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-d5sq6_3e5cf32e-9b90-4518-86bb-5237dbf97e55/manager/0.log" Mar 19 21:11:31 crc kubenswrapper[4799]: I0319 21:11:31.017072 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-f7g26_1e9b69cc-5dc0-400e-9894-7ff0b173e6cb/manager/0.log" Mar 19 21:11:31 crc kubenswrapper[4799]: I0319 21:11:31.450704 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-kswht_1984ed7f-dd4a-43b7-b724-a902bccf7448/manager/0.log" Mar 19 21:11:31 crc kubenswrapper[4799]: I0319 21:11:31.700927 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-rrp47_23a48119-c751-435f-9ed5-5a4b0dcf7ae0/manager/0.log" Mar 19 21:11:31 crc kubenswrapper[4799]: I0319 21:11:31.807653 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-xrrtz_fa0d513e-9d5f-4d00-9ac2-664a4b3a05ce/manager/0.log" Mar 19 21:11:31 crc kubenswrapper[4799]: I0319 21:11:31.874967 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-cnhvk_771a024f-6f6e-43f1-82cb-076a70663c36/manager/0.log" Mar 19 21:11:31 crc kubenswrapper[4799]: I0319 21:11:31.964046 4799 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-ct2xj_52430659-ee8f-4143-a0cc-554487c4ee41/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.041137 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-rhbks_310bb10d-00e4-4135-826e-43f7ca17bdf1/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.174037 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-j6bcm_a0891d87-bed5-4b7b-bab9-653866be0678/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.308328 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-jqm8n_5dffef78-a7d7-400c-8e1c-80fd01df4f07/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.441590 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5s2dz_b1950b27-4b38-4bd5-b858-fcf5aa82d7fd/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.500500 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6wvnr_72be7e05-f329-4420-9632-3f6827c4e0e9/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.655627 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-74c4796899hpqp2_8fe20f94-3898-4826-8f94-a97f5d7619d6/manager/0.log" Mar 19 21:11:32 crc kubenswrapper[4799]: I0319 21:11:32.776446 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b85c4d696-47f2t_4fdcc365-1a41-47ce-8988-24a55f0bb8ac/operator/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.022406 4799 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zw5gb_d4a4d93a-af9b-49a6-8786-34f07a5a4ba4/registry-server/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.246370 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-gjj5c_879a3d02-050b-44dc-95ff-f4a010fe4739/manager/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.288910 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-4z6tx_9ac71989-25c8-4255-8612-a9f736ab50a1/manager/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.552281 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2scch_f0f7e5d2-9450-4672-aa4a-afb3f67f3b6b/operator/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.784173 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-jrtt2_85dcf72e-1669-4316-afe2-2dbf9059cd35/manager/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.856538 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-dhfqp_84e19238-e730-4bcf-9e09-1c6f3421a04d/manager/0.log" Mar 19 21:11:33 crc kubenswrapper[4799]: I0319 21:11:33.918201 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86bd8996f6-bbbxq_d0525132-b508-40b7-a9eb-4773cfde1c32/manager/0.log" Mar 19 21:11:34 crc kubenswrapper[4799]: I0319 21:11:34.082683 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-4v4kl_3cac2ce5-3a90-4588-88ba-11557915c62c/manager/0.log" Mar 19 21:11:34 crc kubenswrapper[4799]: I0319 21:11:34.107346 4799 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-vp4rw_503586c4-3015-43d9-bb6e-56bef997c641/manager/0.log" Mar 19 21:11:42 crc kubenswrapper[4799]: I0319 21:11:42.116093 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:11:42 crc kubenswrapper[4799]: E0319 21:11:42.116827 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:11:54 crc kubenswrapper[4799]: I0319 21:11:54.116770 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:11:54 crc kubenswrapper[4799]: E0319 21:11:54.117357 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:11:55 crc kubenswrapper[4799]: I0319 21:11:55.050035 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-drzzt_8dd97bd4-11f1-48a2-ba74-2eba33de161b/control-plane-machine-set-operator/0.log" Mar 19 21:11:55 crc kubenswrapper[4799]: I0319 21:11:55.219550 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6ddkj_0b71b47b-d667-49c1-ae5b-3326bcde5508/machine-api-operator/0.log" Mar 19 21:11:55 crc kubenswrapper[4799]: I0319 21:11:55.246222 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6ddkj_0b71b47b-d667-49c1-ae5b-3326bcde5508/kube-rbac-proxy/0.log" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.154805 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565912-7k9l5"] Mar 19 21:12:00 crc kubenswrapper[4799]: E0319 21:12:00.155893 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e084bf-4096-4d33-90f7-2e8c8d02c223" containerName="container-00" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.155911 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e084bf-4096-4d33-90f7-2e8c8d02c223" containerName="container-00" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.156184 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e084bf-4096-4d33-90f7-2e8c8d02c223" containerName="container-00" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.157022 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.159109 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.160594 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.160944 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.167370 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565912-7k9l5"] Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.257654 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdj99\" (UniqueName: \"kubernetes.io/projected/6cc4e3c6-6841-469d-a4e0-1c2aa744dc67-kube-api-access-qdj99\") pod \"auto-csr-approver-29565912-7k9l5\" (UID: \"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67\") " pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.359437 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdj99\" (UniqueName: \"kubernetes.io/projected/6cc4e3c6-6841-469d-a4e0-1c2aa744dc67-kube-api-access-qdj99\") pod \"auto-csr-approver-29565912-7k9l5\" (UID: \"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67\") " pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.384261 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdj99\" (UniqueName: \"kubernetes.io/projected/6cc4e3c6-6841-469d-a4e0-1c2aa744dc67-kube-api-access-qdj99\") pod \"auto-csr-approver-29565912-7k9l5\" (UID: \"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67\") " 
pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.473697 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.984500 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565912-7k9l5"] Mar 19 21:12:00 crc kubenswrapper[4799]: I0319 21:12:00.996480 4799 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 21:12:01 crc kubenswrapper[4799]: I0319 21:12:01.403634 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" event={"ID":"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67","Type":"ContainerStarted","Data":"8fef682982deec86001f9b3477d5dc2fc57c7ca8efa5c03f7141e8f520826fcf"} Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.425846 4799 generic.go:334] "Generic (PLEG): container finished" podID="6cc4e3c6-6841-469d-a4e0-1c2aa744dc67" containerID="5b7d7b126fa468f9fdf84452ec528d856791a52b32563a11f564bce6d7477ad8" exitCode=0 Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.426542 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" event={"ID":"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67","Type":"ContainerDied","Data":"5b7d7b126fa468f9fdf84452ec528d856791a52b32563a11f564bce6d7477ad8"} Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.431644 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zn9dq"] Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.436224 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.443282 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn9dq"] Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.520278 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-catalog-content\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.520672 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7hrv\" (UniqueName: \"kubernetes.io/projected/b7037d2f-c6e2-451c-8464-825b72736f58-kube-api-access-f7hrv\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.520732 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-utilities\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.622983 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-catalog-content\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.623070 4799 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-f7hrv\" (UniqueName: \"kubernetes.io/projected/b7037d2f-c6e2-451c-8464-825b72736f58-kube-api-access-f7hrv\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.623123 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-utilities\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.623543 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-catalog-content\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.623652 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-utilities\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.652860 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7hrv\" (UniqueName: \"kubernetes.io/projected/b7037d2f-c6e2-451c-8464-825b72736f58-kube-api-access-f7hrv\") pod \"redhat-operators-zn9dq\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:03 crc kubenswrapper[4799]: I0319 21:12:03.781123 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:04 crc kubenswrapper[4799]: I0319 21:12:04.259375 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn9dq"] Mar 19 21:12:04 crc kubenswrapper[4799]: W0319 21:12:04.275739 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7037d2f_c6e2_451c_8464_825b72736f58.slice/crio-fb15845b58d12f8ec25b0b270b40eb89ca687daad2b47a16e9ee7707eadff17f WatchSource:0}: Error finding container fb15845b58d12f8ec25b0b270b40eb89ca687daad2b47a16e9ee7707eadff17f: Status 404 returned error can't find the container with id fb15845b58d12f8ec25b0b270b40eb89ca687daad2b47a16e9ee7707eadff17f Mar 19 21:12:04 crc kubenswrapper[4799]: I0319 21:12:04.436244 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerStarted","Data":"fb15845b58d12f8ec25b0b270b40eb89ca687daad2b47a16e9ee7707eadff17f"} Mar 19 21:12:04 crc kubenswrapper[4799]: I0319 21:12:04.796993 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:04 crc kubenswrapper[4799]: I0319 21:12:04.840626 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdj99\" (UniqueName: \"kubernetes.io/projected/6cc4e3c6-6841-469d-a4e0-1c2aa744dc67-kube-api-access-qdj99\") pod \"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67\" (UID: \"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67\") " Mar 19 21:12:04 crc kubenswrapper[4799]: I0319 21:12:04.846158 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc4e3c6-6841-469d-a4e0-1c2aa744dc67-kube-api-access-qdj99" (OuterVolumeSpecName: "kube-api-access-qdj99") pod "6cc4e3c6-6841-469d-a4e0-1c2aa744dc67" (UID: "6cc4e3c6-6841-469d-a4e0-1c2aa744dc67"). InnerVolumeSpecName "kube-api-access-qdj99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:12:04 crc kubenswrapper[4799]: I0319 21:12:04.943297 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdj99\" (UniqueName: \"kubernetes.io/projected/6cc4e3c6-6841-469d-a4e0-1c2aa744dc67-kube-api-access-qdj99\") on node \"crc\" DevicePath \"\"" Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.448009 4799 generic.go:334] "Generic (PLEG): container finished" podID="b7037d2f-c6e2-451c-8464-825b72736f58" containerID="eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45" exitCode=0 Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.448085 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerDied","Data":"eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45"} Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.452943 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" 
event={"ID":"6cc4e3c6-6841-469d-a4e0-1c2aa744dc67","Type":"ContainerDied","Data":"8fef682982deec86001f9b3477d5dc2fc57c7ca8efa5c03f7141e8f520826fcf"} Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.452975 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fef682982deec86001f9b3477d5dc2fc57c7ca8efa5c03f7141e8f520826fcf" Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.453089 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565912-7k9l5" Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.865676 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565906-nmp4d"] Mar 19 21:12:05 crc kubenswrapper[4799]: I0319 21:12:05.874310 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565906-nmp4d"] Mar 19 21:12:06 crc kubenswrapper[4799]: I0319 21:12:06.466661 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerStarted","Data":"e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4"} Mar 19 21:12:07 crc kubenswrapper[4799]: I0319 21:12:07.116199 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:12:07 crc kubenswrapper[4799]: E0319 21:12:07.116830 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:12:07 crc kubenswrapper[4799]: I0319 21:12:07.134285 4799 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5babc98b-61ae-4dc4-a213-b01a732a7a1a" path="/var/lib/kubelet/pods/5babc98b-61ae-4dc4-a213-b01a732a7a1a/volumes" Mar 19 21:12:09 crc kubenswrapper[4799]: I0319 21:12:09.494469 4799 generic.go:334] "Generic (PLEG): container finished" podID="b7037d2f-c6e2-451c-8464-825b72736f58" containerID="e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4" exitCode=0 Mar 19 21:12:09 crc kubenswrapper[4799]: I0319 21:12:09.494539 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerDied","Data":"e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4"} Mar 19 21:12:09 crc kubenswrapper[4799]: I0319 21:12:09.531268 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fv54b_5f8bca57-6368-4bd6-9d79-d0e640dd074f/cert-manager-controller/0.log" Mar 19 21:12:09 crc kubenswrapper[4799]: I0319 21:12:09.603965 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-9ppqb_a471aa18-d5fa-455b-b8a6-395717db50b9/cert-manager-cainjector/0.log" Mar 19 21:12:09 crc kubenswrapper[4799]: I0319 21:12:09.714183 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-jtk48_570da1bb-c2ff-40e5-a2b6-352d09168d6d/cert-manager-webhook/0.log" Mar 19 21:12:11 crc kubenswrapper[4799]: I0319 21:12:11.528826 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerStarted","Data":"975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643"} Mar 19 21:12:13 crc kubenswrapper[4799]: I0319 21:12:13.781892 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:13 crc 
kubenswrapper[4799]: I0319 21:12:13.783802 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:14 crc kubenswrapper[4799]: I0319 21:12:14.832264 4799 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zn9dq" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="registry-server" probeResult="failure" output=< Mar 19 21:12:14 crc kubenswrapper[4799]: timeout: failed to connect service ":50051" within 1s Mar 19 21:12:14 crc kubenswrapper[4799]: > Mar 19 21:12:20 crc kubenswrapper[4799]: I0319 21:12:20.116331 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:12:20 crc kubenswrapper[4799]: E0319 21:12:20.118034 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:12:22 crc kubenswrapper[4799]: I0319 21:12:22.698677 4799 scope.go:117] "RemoveContainer" containerID="c697b47b06a97066b2cdb12e98f73f6b5beced493308146ac5cbaa7024c6bc91" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.380050 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-9s6cx_540372f2-ca9d-47b0-aaa6-86831627cd8e/nmstate-console-plugin/0.log" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.568096 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qqjcl_7c8d90dd-d173-4fd7-a3ae-ed312bc20861/nmstate-handler/0.log" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.614958 4799 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qzf64_dab69c67-b7fc-4f89-93c6-6ee825d89b7d/kube-rbac-proxy/0.log" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.690195 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qzf64_dab69c67-b7fc-4f89-93c6-6ee825d89b7d/nmstate-metrics/0.log" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.746030 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-29txn_44b60556-07ce-4245-a994-dded304e075b/nmstate-operator/0.log" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.830849 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.857159 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zn9dq" podStartSLOduration=15.708310111 podStartE2EDuration="20.857143043s" podCreationTimestamp="2026-03-19 21:12:03 +0000 UTC" firstStartedPulling="2026-03-19 21:12:05.450233349 +0000 UTC m=+4003.056186431" lastFinishedPulling="2026-03-19 21:12:10.599066291 +0000 UTC m=+4008.205019363" observedRunningTime="2026-03-19 21:12:11.551527243 +0000 UTC m=+4009.157480335" watchObservedRunningTime="2026-03-19 21:12:23.857143043 +0000 UTC m=+4021.463096105" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.869110 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-psznr_dff785fc-6dbc-40bf-a5b3-d950ed4cb6e6/nmstate-webhook/0.log" Mar 19 21:12:23 crc kubenswrapper[4799]: I0319 21:12:23.875416 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:24 crc kubenswrapper[4799]: I0319 21:12:24.074366 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-zn9dq"] Mar 19 21:12:25 crc kubenswrapper[4799]: I0319 21:12:25.659499 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zn9dq" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="registry-server" containerID="cri-o://975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643" gracePeriod=2 Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.138549 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.252003 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-catalog-content\") pod \"b7037d2f-c6e2-451c-8464-825b72736f58\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.252529 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-utilities\") pod \"b7037d2f-c6e2-451c-8464-825b72736f58\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.252643 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7hrv\" (UniqueName: \"kubernetes.io/projected/b7037d2f-c6e2-451c-8464-825b72736f58-kube-api-access-f7hrv\") pod \"b7037d2f-c6e2-451c-8464-825b72736f58\" (UID: \"b7037d2f-c6e2-451c-8464-825b72736f58\") " Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.253803 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-utilities" (OuterVolumeSpecName: "utilities") pod "b7037d2f-c6e2-451c-8464-825b72736f58" (UID: 
"b7037d2f-c6e2-451c-8464-825b72736f58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.260779 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7037d2f-c6e2-451c-8464-825b72736f58-kube-api-access-f7hrv" (OuterVolumeSpecName: "kube-api-access-f7hrv") pod "b7037d2f-c6e2-451c-8464-825b72736f58" (UID: "b7037d2f-c6e2-451c-8464-825b72736f58"). InnerVolumeSpecName "kube-api-access-f7hrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.354695 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.354761 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7hrv\" (UniqueName: \"kubernetes.io/projected/b7037d2f-c6e2-451c-8464-825b72736f58-kube-api-access-f7hrv\") on node \"crc\" DevicePath \"\"" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.382685 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7037d2f-c6e2-451c-8464-825b72736f58" (UID: "b7037d2f-c6e2-451c-8464-825b72736f58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.457291 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7037d2f-c6e2-451c-8464-825b72736f58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.674154 4799 generic.go:334] "Generic (PLEG): container finished" podID="b7037d2f-c6e2-451c-8464-825b72736f58" containerID="975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643" exitCode=0 Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.674212 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerDied","Data":"975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643"} Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.674249 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn9dq" event={"ID":"b7037d2f-c6e2-451c-8464-825b72736f58","Type":"ContainerDied","Data":"fb15845b58d12f8ec25b0b270b40eb89ca687daad2b47a16e9ee7707eadff17f"} Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.674277 4799 scope.go:117] "RemoveContainer" containerID="975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.674519 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn9dq" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.722043 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn9dq"] Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.726449 4799 scope.go:117] "RemoveContainer" containerID="e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.731814 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zn9dq"] Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.753553 4799 scope.go:117] "RemoveContainer" containerID="eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.803978 4799 scope.go:117] "RemoveContainer" containerID="975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643" Mar 19 21:12:26 crc kubenswrapper[4799]: E0319 21:12:26.804493 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643\": container with ID starting with 975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643 not found: ID does not exist" containerID="975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.804542 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643"} err="failed to get container status \"975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643\": rpc error: code = NotFound desc = could not find container \"975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643\": container with ID starting with 975d551b795ca9d9c25d1dbf6a926fadcb7e137d901ec2b6d2cc22520ed24643 not found: ID does 
not exist" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.804570 4799 scope.go:117] "RemoveContainer" containerID="e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4" Mar 19 21:12:26 crc kubenswrapper[4799]: E0319 21:12:26.804885 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4\": container with ID starting with e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4 not found: ID does not exist" containerID="e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.804928 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4"} err="failed to get container status \"e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4\": rpc error: code = NotFound desc = could not find container \"e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4\": container with ID starting with e5c87dcaf65542dae11b9b944138b2a3dedc208fcfc500cfa114e56a21c48aa4 not found: ID does not exist" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.804954 4799 scope.go:117] "RemoveContainer" containerID="eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45" Mar 19 21:12:26 crc kubenswrapper[4799]: E0319 21:12:26.805224 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45\": container with ID starting with eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45 not found: ID does not exist" containerID="eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45" Mar 19 21:12:26 crc kubenswrapper[4799]: I0319 21:12:26.805250 4799 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45"} err="failed to get container status \"eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45\": rpc error: code = NotFound desc = could not find container \"eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45\": container with ID starting with eb04b2f7b1847af0a4b2498b98d94f197521c1abffdab47809b8d71bfced1a45 not found: ID does not exist" Mar 19 21:12:27 crc kubenswrapper[4799]: I0319 21:12:27.126556 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" path="/var/lib/kubelet/pods/b7037d2f-c6e2-451c-8464-825b72736f58/volumes" Mar 19 21:12:31 crc kubenswrapper[4799]: I0319 21:12:31.117105 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:12:31 crc kubenswrapper[4799]: E0319 21:12:31.118569 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:12:43 crc kubenswrapper[4799]: I0319 21:12:43.131167 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:12:43 crc kubenswrapper[4799]: E0319 21:12:43.132044 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.176498 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pw9bq"] Mar 19 21:12:45 crc kubenswrapper[4799]: E0319 21:12:45.177338 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="registry-server" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.177355 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="registry-server" Mar 19 21:12:45 crc kubenswrapper[4799]: E0319 21:12:45.177415 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc4e3c6-6841-469d-a4e0-1c2aa744dc67" containerName="oc" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.177425 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc4e3c6-6841-469d-a4e0-1c2aa744dc67" containerName="oc" Mar 19 21:12:45 crc kubenswrapper[4799]: E0319 21:12:45.177457 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="extract-utilities" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.177466 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="extract-utilities" Mar 19 21:12:45 crc kubenswrapper[4799]: E0319 21:12:45.177478 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="extract-content" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.177485 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="extract-content" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.177703 4799 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6cc4e3c6-6841-469d-a4e0-1c2aa744dc67" containerName="oc" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.177727 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7037d2f-c6e2-451c-8464-825b72736f58" containerName="registry-server" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.179401 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.205209 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pw9bq"] Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.328748 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-utilities\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.328946 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59q27\" (UniqueName: \"kubernetes.io/projected/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-kube-api-access-59q27\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.329104 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-catalog-content\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.431236 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-utilities\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.431399 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59q27\" (UniqueName: \"kubernetes.io/projected/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-kube-api-access-59q27\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.431466 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-catalog-content\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.432019 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-catalog-content\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.432237 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-utilities\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.464482 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-59q27\" (UniqueName: \"kubernetes.io/projected/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-kube-api-access-59q27\") pod \"community-operators-pw9bq\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.496695 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.773173 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pw9bq"] Mar 19 21:12:45 crc kubenswrapper[4799]: I0319 21:12:45.877903 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerStarted","Data":"c9182953f9b0fb55fc5bbf68af1a7bf3917c82c7565e08eb9ecc4f1f1e5f6b14"} Mar 19 21:12:46 crc kubenswrapper[4799]: I0319 21:12:46.890908 4799 generic.go:334] "Generic (PLEG): container finished" podID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerID="43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501" exitCode=0 Mar 19 21:12:46 crc kubenswrapper[4799]: I0319 21:12:46.891016 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerDied","Data":"43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501"} Mar 19 21:12:47 crc kubenswrapper[4799]: I0319 21:12:47.900959 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerStarted","Data":"2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9"} Mar 19 21:12:49 crc kubenswrapper[4799]: I0319 21:12:49.920619 4799 generic.go:334] "Generic (PLEG): container finished" 
podID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerID="2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9" exitCode=0 Mar 19 21:12:49 crc kubenswrapper[4799]: I0319 21:12:49.920694 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerDied","Data":"2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9"} Mar 19 21:12:50 crc kubenswrapper[4799]: I0319 21:12:50.932570 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerStarted","Data":"265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1"} Mar 19 21:12:50 crc kubenswrapper[4799]: I0319 21:12:50.961253 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pw9bq" podStartSLOduration=2.511600117 podStartE2EDuration="5.961232594s" podCreationTimestamp="2026-03-19 21:12:45 +0000 UTC" firstStartedPulling="2026-03-19 21:12:46.894357413 +0000 UTC m=+4044.500310515" lastFinishedPulling="2026-03-19 21:12:50.34398991 +0000 UTC m=+4047.949942992" observedRunningTime="2026-03-19 21:12:50.951732065 +0000 UTC m=+4048.557685137" watchObservedRunningTime="2026-03-19 21:12:50.961232594 +0000 UTC m=+4048.567185666" Mar 19 21:12:53 crc kubenswrapper[4799]: I0319 21:12:53.840840 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-9nd8c_8a6fd137-8e20-4043-b746-7d4b884ffc5a/kube-rbac-proxy/0.log" Mar 19 21:12:53 crc kubenswrapper[4799]: I0319 21:12:53.882981 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-9nd8c_8a6fd137-8e20-4043-b746-7d4b884ffc5a/controller/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.003777 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.116356 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:12:54 crc kubenswrapper[4799]: E0319 21:12:54.116886 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.168979 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.193545 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.196874 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.200738 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.358814 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.364797 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.372980 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.385750 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.601940 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-frr-files/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.639920 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/controller/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.641637 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-reloader/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.654875 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/cp-metrics/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.809845 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/frr-metrics/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.835446 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/kube-rbac-proxy-frr/0.log" Mar 19 21:12:54 crc kubenswrapper[4799]: I0319 21:12:54.873053 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/kube-rbac-proxy/0.log" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.059924 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/reloader/0.log" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.074303 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-2gv7v_b5cbaa72-f84a-4672-bb65-f67e4cf5ac5a/frr-k8s-webhook-server/0.log" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.348630 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5578d7df77-xlzz9_993c9a96-b852-40c4-87e6-02e706b89b25/manager/0.log" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.497204 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.497500 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.551738 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59fdf54f4b-tp45h_0ec1faac-95e3-4189-bbfd-acc0f4662787/webhook-server/0.log" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.563779 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:55 crc kubenswrapper[4799]: I0319 21:12:55.571244 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m865t_f71d2873-ac06-4cca-b70c-162e283e23b8/kube-rbac-proxy/0.log" Mar 19 21:12:56 crc kubenswrapper[4799]: I0319 21:12:56.016903 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:56 crc kubenswrapper[4799]: I0319 21:12:56.172189 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m865t_f71d2873-ac06-4cca-b70c-162e283e23b8/speaker/0.log" Mar 19 21:12:56 crc kubenswrapper[4799]: I0319 21:12:56.376637 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m52p2_66cf30af-75f2-49da-a9de-cb266154b446/frr/0.log" Mar 19 21:12:58 crc kubenswrapper[4799]: I0319 21:12:58.046312 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pw9bq"] Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.000308 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pw9bq" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="registry-server" containerID="cri-o://265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1" gracePeriod=2 Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.525356 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.580411 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-utilities\") pod \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.580562 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-catalog-content\") pod \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.580603 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59q27\" (UniqueName: \"kubernetes.io/projected/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-kube-api-access-59q27\") pod \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\" (UID: \"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559\") " Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.581672 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-utilities" (OuterVolumeSpecName: "utilities") pod "398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" (UID: "398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.587247 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-kube-api-access-59q27" (OuterVolumeSpecName: "kube-api-access-59q27") pod "398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" (UID: "398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559"). InnerVolumeSpecName "kube-api-access-59q27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.626569 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" (UID: "398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.682821 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59q27\" (UniqueName: \"kubernetes.io/projected/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-kube-api-access-59q27\") on node \"crc\" DevicePath \"\"" Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.682856 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:12:59 crc kubenswrapper[4799]: I0319 21:12:59.682870 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.023765 4799 generic.go:334] "Generic (PLEG): container finished" podID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerID="265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1" exitCode=0 Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.024160 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerDied","Data":"265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1"} Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.024238 4799 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-pw9bq" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.024269 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pw9bq" event={"ID":"398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559","Type":"ContainerDied","Data":"c9182953f9b0fb55fc5bbf68af1a7bf3917c82c7565e08eb9ecc4f1f1e5f6b14"} Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.024342 4799 scope.go:117] "RemoveContainer" containerID="265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.064337 4799 scope.go:117] "RemoveContainer" containerID="2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.083611 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pw9bq"] Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.087414 4799 scope.go:117] "RemoveContainer" containerID="43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.102617 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pw9bq"] Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.136162 4799 scope.go:117] "RemoveContainer" containerID="265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1" Mar 19 21:13:00 crc kubenswrapper[4799]: E0319 21:13:00.136650 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1\": container with ID starting with 265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1 not found: ID does not exist" containerID="265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.136691 
4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1"} err="failed to get container status \"265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1\": rpc error: code = NotFound desc = could not find container \"265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1\": container with ID starting with 265d7f6d108416bdf76f347017950cc970613ec0a9bd7c8b36211e718448c0a1 not found: ID does not exist" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.136712 4799 scope.go:117] "RemoveContainer" containerID="2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9" Mar 19 21:13:00 crc kubenswrapper[4799]: E0319 21:13:00.137267 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9\": container with ID starting with 2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9 not found: ID does not exist" containerID="2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.137325 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9"} err="failed to get container status \"2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9\": rpc error: code = NotFound desc = could not find container \"2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9\": container with ID starting with 2c96aaa485075c1645520871537f552f4d4b35dbc55d403152ebcc1c6e9d19d9 not found: ID does not exist" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.137358 4799 scope.go:117] "RemoveContainer" containerID="43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501" Mar 19 21:13:00 crc kubenswrapper[4799]: E0319 
21:13:00.137663 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501\": container with ID starting with 43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501 not found: ID does not exist" containerID="43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501" Mar 19 21:13:00 crc kubenswrapper[4799]: I0319 21:13:00.137697 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501"} err="failed to get container status \"43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501\": rpc error: code = NotFound desc = could not find container \"43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501\": container with ID starting with 43d9525b88d2909813b17e109997d201fb8e69aec3718bc8254519ab01c0e501 not found: ID does not exist" Mar 19 21:13:01 crc kubenswrapper[4799]: I0319 21:13:01.136633 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" path="/var/lib/kubelet/pods/398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559/volumes" Mar 19 21:13:08 crc kubenswrapper[4799]: I0319 21:13:08.116694 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:13:08 crc kubenswrapper[4799]: E0319 21:13:08.117943 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:13:11 crc kubenswrapper[4799]: I0319 21:13:11.770637 
4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/util/0.log" Mar 19 21:13:11 crc kubenswrapper[4799]: I0319 21:13:11.937412 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/util/0.log" Mar 19 21:13:11 crc kubenswrapper[4799]: I0319 21:13:11.979685 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/pull/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.024738 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/pull/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.131909 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/util/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.145235 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/pull/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.206004 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748dn5f_527f6060-14f0-48e5-b8a9-4fc91d1775a6/extract/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.287795 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/util/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.498779 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/pull/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.508583 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/util/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.509105 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/pull/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.652179 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/util/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.659908 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/pull/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.686182 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1jz5pb_416e049b-dc1c-4119-b204-92e1e4f9513c/extract/0.log" Mar 19 21:13:12 crc kubenswrapper[4799]: I0319 21:13:12.815044 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-utilities/0.log" Mar 19 21:13:12 crc 
kubenswrapper[4799]: I0319 21:13:12.991576 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-content/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.000257 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-content/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.002642 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-utilities/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.162944 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-utilities/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.188464 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/extract-content/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.368861 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz4lk_f0e26ff5-2c74-45b7-99a4-761342edc10c/registry-server/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.392214 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-utilities/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.540715 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-utilities/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.552877 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-content/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.566722 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-content/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.723338 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-content/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.762022 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/extract-utilities/0.log" Mar 19 21:13:13 crc kubenswrapper[4799]: I0319 21:13:13.929353 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s6hbz_40e9b27a-0c5f-45a3-b424-d1b289b65167/marketplace-operator/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.436167 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-utilities/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.454510 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kxq57_b33abb02-78fb-4afe-af5b-f754a26df60c/registry-server/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.506567 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-utilities/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.586636 4799 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-content/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.596611 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-content/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.783420 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-utilities/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.794198 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/extract-content/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.916360 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2tw9x_dd60eeaa-edc8-447f-866a-f20eefff40c8/registry-server/0.log" Mar 19 21:13:14 crc kubenswrapper[4799]: I0319 21:13:14.976747 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-utilities/0.log" Mar 19 21:13:15 crc kubenswrapper[4799]: I0319 21:13:15.126624 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-content/0.log" Mar 19 21:13:15 crc kubenswrapper[4799]: I0319 21:13:15.129101 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-utilities/0.log" Mar 19 21:13:15 crc kubenswrapper[4799]: I0319 21:13:15.141133 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-content/0.log" 
Mar 19 21:13:16 crc kubenswrapper[4799]: I0319 21:13:16.197039 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-content/0.log" Mar 19 21:13:16 crc kubenswrapper[4799]: I0319 21:13:16.197352 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/extract-utilities/0.log" Mar 19 21:13:16 crc kubenswrapper[4799]: I0319 21:13:16.799890 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zwqtk_23e7994c-4b3e-4f89-bf49-4166fd9a2a78/registry-server/0.log" Mar 19 21:13:19 crc kubenswrapper[4799]: I0319 21:13:19.116481 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:13:19 crc kubenswrapper[4799]: E0319 21:13:19.117967 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:13:33 crc kubenswrapper[4799]: I0319 21:13:33.127594 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:13:33 crc kubenswrapper[4799]: E0319 21:13:33.128848 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:13:45 crc kubenswrapper[4799]: I0319 21:13:45.115781 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:13:45 crc kubenswrapper[4799]: E0319 21:13:45.116823 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:13:58 crc kubenswrapper[4799]: I0319 21:13:58.116441 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:13:58 crc kubenswrapper[4799]: E0319 21:13:58.117620 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.152054 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565914-pbdq4"] Mar 19 21:14:00 crc kubenswrapper[4799]: E0319 21:14:00.153414 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="registry-server" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.153432 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="registry-server" Mar 19 21:14:00 crc kubenswrapper[4799]: 
E0319 21:14:00.153455 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="extract-content" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.153464 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="extract-content" Mar 19 21:14:00 crc kubenswrapper[4799]: E0319 21:14:00.153507 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="extract-utilities" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.153518 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="extract-utilities" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.153944 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="398ce0a9-3ca4-4a3a-b2cc-a7569a7bd559" containerName="registry-server" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.154862 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.157784 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.158227 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.158469 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.179350 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565914-pbdq4"] Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.346486 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj69l\" (UniqueName: \"kubernetes.io/projected/0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5-kube-api-access-kj69l\") pod \"auto-csr-approver-29565914-pbdq4\" (UID: \"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5\") " pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:00 crc kubenswrapper[4799]: I0319 21:14:00.449841 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj69l\" (UniqueName: \"kubernetes.io/projected/0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5-kube-api-access-kj69l\") pod \"auto-csr-approver-29565914-pbdq4\" (UID: \"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5\") " pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:01 crc kubenswrapper[4799]: I0319 21:14:01.281481 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj69l\" (UniqueName: \"kubernetes.io/projected/0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5-kube-api-access-kj69l\") pod \"auto-csr-approver-29565914-pbdq4\" (UID: \"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5\") " 
pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:01 crc kubenswrapper[4799]: I0319 21:14:01.382180 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:01 crc kubenswrapper[4799]: I0319 21:14:01.856432 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565914-pbdq4"] Mar 19 21:14:01 crc kubenswrapper[4799]: W0319 21:14:01.864883 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0072d17e_1ed9_4ab2_b0ae_fd33ed2efff5.slice/crio-bb278fa18fc0ddb7f51b4af2064392f626ccd315e57ef479bbc86d15e831b219 WatchSource:0}: Error finding container bb278fa18fc0ddb7f51b4af2064392f626ccd315e57ef479bbc86d15e831b219: Status 404 returned error can't find the container with id bb278fa18fc0ddb7f51b4af2064392f626ccd315e57ef479bbc86d15e831b219 Mar 19 21:14:02 crc kubenswrapper[4799]: I0319 21:14:02.697021 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" event={"ID":"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5","Type":"ContainerStarted","Data":"bb278fa18fc0ddb7f51b4af2064392f626ccd315e57ef479bbc86d15e831b219"} Mar 19 21:14:03 crc kubenswrapper[4799]: I0319 21:14:03.716503 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" event={"ID":"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5","Type":"ContainerStarted","Data":"fc5747838123bfc75426ff9629dc8f1b0124fb2c655d1fcd825880fb14ca938d"} Mar 19 21:14:03 crc kubenswrapper[4799]: I0319 21:14:03.740638 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" podStartSLOduration=2.748483437 podStartE2EDuration="3.74061632s" podCreationTimestamp="2026-03-19 21:14:00 +0000 UTC" firstStartedPulling="2026-03-19 21:14:01.869048107 +0000 UTC 
m=+4119.475001179" lastFinishedPulling="2026-03-19 21:14:02.86118095 +0000 UTC m=+4120.467134062" observedRunningTime="2026-03-19 21:14:03.733918141 +0000 UTC m=+4121.339871213" watchObservedRunningTime="2026-03-19 21:14:03.74061632 +0000 UTC m=+4121.346569402" Mar 19 21:14:04 crc kubenswrapper[4799]: I0319 21:14:04.727628 4799 generic.go:334] "Generic (PLEG): container finished" podID="0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5" containerID="fc5747838123bfc75426ff9629dc8f1b0124fb2c655d1fcd825880fb14ca938d" exitCode=0 Mar 19 21:14:04 crc kubenswrapper[4799]: I0319 21:14:04.727756 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" event={"ID":"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5","Type":"ContainerDied","Data":"fc5747838123bfc75426ff9629dc8f1b0124fb2c655d1fcd825880fb14ca938d"} Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.118539 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.235440 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565908-mg7ws"] Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.243059 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565908-mg7ws"] Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.270169 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj69l\" (UniqueName: \"kubernetes.io/projected/0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5-kube-api-access-kj69l\") pod \"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5\" (UID: \"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5\") " Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.279679 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5-kube-api-access-kj69l" 
(OuterVolumeSpecName: "kube-api-access-kj69l") pod "0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5" (UID: "0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5"). InnerVolumeSpecName "kube-api-access-kj69l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.374599 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj69l\" (UniqueName: \"kubernetes.io/projected/0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5-kube-api-access-kj69l\") on node \"crc\" DevicePath \"\"" Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.754742 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" event={"ID":"0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5","Type":"ContainerDied","Data":"bb278fa18fc0ddb7f51b4af2064392f626ccd315e57ef479bbc86d15e831b219"} Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.755074 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb278fa18fc0ddb7f51b4af2064392f626ccd315e57ef479bbc86d15e831b219" Mar 19 21:14:06 crc kubenswrapper[4799]: I0319 21:14:06.754907 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565914-pbdq4" Mar 19 21:14:07 crc kubenswrapper[4799]: I0319 21:14:07.129723 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72726a12-de75-4af2-be10-0aece53735ae" path="/var/lib/kubelet/pods/72726a12-de75-4af2-be10-0aece53735ae/volumes" Mar 19 21:14:11 crc kubenswrapper[4799]: I0319 21:14:11.117823 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:14:11 crc kubenswrapper[4799]: E0319 21:14:11.119630 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.530640 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8r7p"] Mar 19 21:14:14 crc kubenswrapper[4799]: E0319 21:14:14.532190 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5" containerName="oc" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.532215 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5" containerName="oc" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.532611 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="0072d17e-1ed9-4ab2-b0ae-fd33ed2efff5" containerName="oc" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.535217 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.552881 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8r7p"] Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.578328 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-utilities\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.578452 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-catalog-content\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.578581 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2swx\" (UniqueName: \"kubernetes.io/projected/5e13dd6e-2405-4422-8bbd-fa89f02e9812-kube-api-access-x2swx\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.679287 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-utilities\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.679376 4799 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-catalog-content\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.679518 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2swx\" (UniqueName: \"kubernetes.io/projected/5e13dd6e-2405-4422-8bbd-fa89f02e9812-kube-api-access-x2swx\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.679985 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-utilities\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.689788 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-catalog-content\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.715785 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2swx\" (UniqueName: \"kubernetes.io/projected/5e13dd6e-2405-4422-8bbd-fa89f02e9812-kube-api-access-x2swx\") pod \"certified-operators-g8r7p\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:14 crc kubenswrapper[4799]: I0319 21:14:14.906784 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:15 crc kubenswrapper[4799]: I0319 21:14:15.190925 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8r7p"] Mar 19 21:14:15 crc kubenswrapper[4799]: I0319 21:14:15.872158 4799 generic.go:334] "Generic (PLEG): container finished" podID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerID="3cbeb4668149201eba7b84065f338cd24845b34209046f65f940cf4fdad26fe6" exitCode=0 Mar 19 21:14:15 crc kubenswrapper[4799]: I0319 21:14:15.872218 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerDied","Data":"3cbeb4668149201eba7b84065f338cd24845b34209046f65f940cf4fdad26fe6"} Mar 19 21:14:15 crc kubenswrapper[4799]: I0319 21:14:15.872562 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerStarted","Data":"cba65f3263c75391cb84a2ad575bebd17f0a22fde4489260206e4a68c8cf805a"} Mar 19 21:14:17 crc kubenswrapper[4799]: I0319 21:14:17.911303 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerStarted","Data":"075fcd504191b4500b30dedd0451e2c794798ee58dd455e30626dd94a3e509ad"} Mar 19 21:14:18 crc kubenswrapper[4799]: I0319 21:14:18.922350 4799 generic.go:334] "Generic (PLEG): container finished" podID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerID="075fcd504191b4500b30dedd0451e2c794798ee58dd455e30626dd94a3e509ad" exitCode=0 Mar 19 21:14:18 crc kubenswrapper[4799]: I0319 21:14:18.922449 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" 
event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerDied","Data":"075fcd504191b4500b30dedd0451e2c794798ee58dd455e30626dd94a3e509ad"} Mar 19 21:14:19 crc kubenswrapper[4799]: I0319 21:14:19.939887 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerStarted","Data":"4b94986e2eee446fd76ae7ce024033d28f78a67916d627a762f32c4ca8805e0a"} Mar 19 21:14:19 crc kubenswrapper[4799]: I0319 21:14:19.981729 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8r7p" podStartSLOduration=2.503082875 podStartE2EDuration="5.98170662s" podCreationTimestamp="2026-03-19 21:14:14 +0000 UTC" firstStartedPulling="2026-03-19 21:14:15.876058764 +0000 UTC m=+4133.482011836" lastFinishedPulling="2026-03-19 21:14:19.354682509 +0000 UTC m=+4136.960635581" observedRunningTime="2026-03-19 21:14:19.965458901 +0000 UTC m=+4137.571411983" watchObservedRunningTime="2026-03-19 21:14:19.98170662 +0000 UTC m=+4137.587659702" Mar 19 21:14:22 crc kubenswrapper[4799]: I0319 21:14:22.855688 4799 scope.go:117] "RemoveContainer" containerID="34f5f7d8f558c4c1caef74b5c2de59a685314cd52232e096fa0e4251532464e8" Mar 19 21:14:24 crc kubenswrapper[4799]: I0319 21:14:24.116372 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:14:24 crc kubenswrapper[4799]: E0319 21:14:24.117087 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:14:24 crc kubenswrapper[4799]: 
I0319 21:14:24.907306 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:24 crc kubenswrapper[4799]: I0319 21:14:24.907374 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:25 crc kubenswrapper[4799]: I0319 21:14:25.019621 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:25 crc kubenswrapper[4799]: I0319 21:14:25.094125 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:27 crc kubenswrapper[4799]: I0319 21:14:27.860142 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8r7p"] Mar 19 21:14:27 crc kubenswrapper[4799]: I0319 21:14:27.860852 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g8r7p" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="registry-server" containerID="cri-o://4b94986e2eee446fd76ae7ce024033d28f78a67916d627a762f32c4ca8805e0a" gracePeriod=2 Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.055951 4799 generic.go:334] "Generic (PLEG): container finished" podID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerID="4b94986e2eee446fd76ae7ce024033d28f78a67916d627a762f32c4ca8805e0a" exitCode=0 Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.056029 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerDied","Data":"4b94986e2eee446fd76ae7ce024033d28f78a67916d627a762f32c4ca8805e0a"} Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.454230 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.579006 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2swx\" (UniqueName: \"kubernetes.io/projected/5e13dd6e-2405-4422-8bbd-fa89f02e9812-kube-api-access-x2swx\") pod \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.579079 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-utilities\") pod \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.579377 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-catalog-content\") pod \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\" (UID: \"5e13dd6e-2405-4422-8bbd-fa89f02e9812\") " Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.580692 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-utilities" (OuterVolumeSpecName: "utilities") pod "5e13dd6e-2405-4422-8bbd-fa89f02e9812" (UID: "5e13dd6e-2405-4422-8bbd-fa89f02e9812"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.589363 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e13dd6e-2405-4422-8bbd-fa89f02e9812-kube-api-access-x2swx" (OuterVolumeSpecName: "kube-api-access-x2swx") pod "5e13dd6e-2405-4422-8bbd-fa89f02e9812" (UID: "5e13dd6e-2405-4422-8bbd-fa89f02e9812"). InnerVolumeSpecName "kube-api-access-x2swx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.655687 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e13dd6e-2405-4422-8bbd-fa89f02e9812" (UID: "5e13dd6e-2405-4422-8bbd-fa89f02e9812"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.682157 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.682195 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2swx\" (UniqueName: \"kubernetes.io/projected/5e13dd6e-2405-4422-8bbd-fa89f02e9812-kube-api-access-x2swx\") on node \"crc\" DevicePath \"\"" Mar 19 21:14:28 crc kubenswrapper[4799]: I0319 21:14:28.682209 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e13dd6e-2405-4422-8bbd-fa89f02e9812-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.065618 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8r7p" event={"ID":"5e13dd6e-2405-4422-8bbd-fa89f02e9812","Type":"ContainerDied","Data":"cba65f3263c75391cb84a2ad575bebd17f0a22fde4489260206e4a68c8cf805a"} Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.066053 4799 scope.go:117] "RemoveContainer" containerID="4b94986e2eee446fd76ae7ce024033d28f78a67916d627a762f32c4ca8805e0a" Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.066247 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8r7p" Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.090791 4799 scope.go:117] "RemoveContainer" containerID="075fcd504191b4500b30dedd0451e2c794798ee58dd455e30626dd94a3e509ad" Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.109475 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8r7p"] Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.129895 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g8r7p"] Mar 19 21:14:29 crc kubenswrapper[4799]: I0319 21:14:29.392923 4799 scope.go:117] "RemoveContainer" containerID="3cbeb4668149201eba7b84065f338cd24845b34209046f65f940cf4fdad26fe6" Mar 19 21:14:31 crc kubenswrapper[4799]: I0319 21:14:31.132036 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" path="/var/lib/kubelet/pods/5e13dd6e-2405-4422-8bbd-fa89f02e9812/volumes" Mar 19 21:14:36 crc kubenswrapper[4799]: I0319 21:14:36.118673 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:14:36 crc kubenswrapper[4799]: E0319 21:14:36.121447 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:14:47 crc kubenswrapper[4799]: I0319 21:14:47.116789 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:14:47 crc kubenswrapper[4799]: E0319 21:14:47.117998 4799 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.166929 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8"] Mar 19 21:15:00 crc kubenswrapper[4799]: E0319 21:15:00.168376 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="extract-content" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.168448 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="extract-content" Mar 19 21:15:00 crc kubenswrapper[4799]: E0319 21:15:00.168525 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="extract-utilities" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.168545 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="extract-utilities" Mar 19 21:15:00 crc kubenswrapper[4799]: E0319 21:15:00.168599 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="registry-server" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.168630 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="registry-server" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.169016 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e13dd6e-2405-4422-8bbd-fa89f02e9812" containerName="registry-server" Mar 19 21:15:00 crc 
kubenswrapper[4799]: I0319 21:15:00.170218 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.175349 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.175644 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.183087 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8"] Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.334771 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpt94\" (UniqueName: \"kubernetes.io/projected/730c9ff4-bb0f-45a7-90bc-bbca574aca95-kube-api-access-jpt94\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.334833 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/730c9ff4-bb0f-45a7-90bc-bbca574aca95-secret-volume\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.334991 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730c9ff4-bb0f-45a7-90bc-bbca574aca95-config-volume\") pod \"collect-profiles-29565915-577k8\" 
(UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.437113 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730c9ff4-bb0f-45a7-90bc-bbca574aca95-config-volume\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.437451 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpt94\" (UniqueName: \"kubernetes.io/projected/730c9ff4-bb0f-45a7-90bc-bbca574aca95-kube-api-access-jpt94\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.438054 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/730c9ff4-bb0f-45a7-90bc-bbca574aca95-secret-volume\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.438671 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730c9ff4-bb0f-45a7-90bc-bbca574aca95-config-volume\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.458316 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/730c9ff4-bb0f-45a7-90bc-bbca574aca95-secret-volume\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.477473 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpt94\" (UniqueName: \"kubernetes.io/projected/730c9ff4-bb0f-45a7-90bc-bbca574aca95-kube-api-access-jpt94\") pod \"collect-profiles-29565915-577k8\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:00 crc kubenswrapper[4799]: I0319 21:15:00.524268 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:01 crc kubenswrapper[4799]: I0319 21:15:01.028066 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8"] Mar 19 21:15:01 crc kubenswrapper[4799]: W0319 21:15:01.037715 4799 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730c9ff4_bb0f_45a7_90bc_bbca574aca95.slice/crio-ba1c1b40189dd1a704f99fec91f24b98df64c635dd74b70d3aa8524b58674f1e WatchSource:0}: Error finding container ba1c1b40189dd1a704f99fec91f24b98df64c635dd74b70d3aa8524b58674f1e: Status 404 returned error can't find the container with id ba1c1b40189dd1a704f99fec91f24b98df64c635dd74b70d3aa8524b58674f1e Mar 19 21:15:01 crc kubenswrapper[4799]: I0319 21:15:01.116953 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:15:01 crc kubenswrapper[4799]: E0319 21:15:01.117275 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:15:01 crc kubenswrapper[4799]: I0319 21:15:01.519735 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" event={"ID":"730c9ff4-bb0f-45a7-90bc-bbca574aca95","Type":"ContainerStarted","Data":"357e1692bcbd794a7a133016d68542451f0f24884467e43f95c67733ffea0126"} Mar 19 21:15:01 crc kubenswrapper[4799]: I0319 21:15:01.520026 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" event={"ID":"730c9ff4-bb0f-45a7-90bc-bbca574aca95","Type":"ContainerStarted","Data":"ba1c1b40189dd1a704f99fec91f24b98df64c635dd74b70d3aa8524b58674f1e"} Mar 19 21:15:01 crc kubenswrapper[4799]: I0319 21:15:01.544421 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" podStartSLOduration=1.544373222 podStartE2EDuration="1.544373222s" podCreationTimestamp="2026-03-19 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 21:15:01.536914811 +0000 UTC m=+4179.142867913" watchObservedRunningTime="2026-03-19 21:15:01.544373222 +0000 UTC m=+4179.150326314" Mar 19 21:15:02 crc kubenswrapper[4799]: I0319 21:15:02.532997 4799 generic.go:334] "Generic (PLEG): container finished" podID="730c9ff4-bb0f-45a7-90bc-bbca574aca95" containerID="357e1692bcbd794a7a133016d68542451f0f24884467e43f95c67733ffea0126" exitCode=0 Mar 19 21:15:02 crc kubenswrapper[4799]: I0319 21:15:02.534161 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" event={"ID":"730c9ff4-bb0f-45a7-90bc-bbca574aca95","Type":"ContainerDied","Data":"357e1692bcbd794a7a133016d68542451f0f24884467e43f95c67733ffea0126"} Mar 19 21:15:03 crc kubenswrapper[4799]: I0319 21:15:03.942705 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.117374 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730c9ff4-bb0f-45a7-90bc-bbca574aca95-config-volume\") pod \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.117637 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpt94\" (UniqueName: \"kubernetes.io/projected/730c9ff4-bb0f-45a7-90bc-bbca574aca95-kube-api-access-jpt94\") pod \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.117694 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/730c9ff4-bb0f-45a7-90bc-bbca574aca95-secret-volume\") pod \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\" (UID: \"730c9ff4-bb0f-45a7-90bc-bbca574aca95\") " Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.118788 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/730c9ff4-bb0f-45a7-90bc-bbca574aca95-config-volume" (OuterVolumeSpecName: "config-volume") pod "730c9ff4-bb0f-45a7-90bc-bbca574aca95" (UID: "730c9ff4-bb0f-45a7-90bc-bbca574aca95"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.125238 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730c9ff4-bb0f-45a7-90bc-bbca574aca95-kube-api-access-jpt94" (OuterVolumeSpecName: "kube-api-access-jpt94") pod "730c9ff4-bb0f-45a7-90bc-bbca574aca95" (UID: "730c9ff4-bb0f-45a7-90bc-bbca574aca95"). InnerVolumeSpecName "kube-api-access-jpt94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.127049 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730c9ff4-bb0f-45a7-90bc-bbca574aca95-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "730c9ff4-bb0f-45a7-90bc-bbca574aca95" (UID: "730c9ff4-bb0f-45a7-90bc-bbca574aca95"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.220155 4799 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730c9ff4-bb0f-45a7-90bc-bbca574aca95-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.220195 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpt94\" (UniqueName: \"kubernetes.io/projected/730c9ff4-bb0f-45a7-90bc-bbca574aca95-kube-api-access-jpt94\") on node \"crc\" DevicePath \"\"" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.220210 4799 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/730c9ff4-bb0f-45a7-90bc-bbca574aca95-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.553771 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" 
event={"ID":"730c9ff4-bb0f-45a7-90bc-bbca574aca95","Type":"ContainerDied","Data":"ba1c1b40189dd1a704f99fec91f24b98df64c635dd74b70d3aa8524b58674f1e"} Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.553812 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1c1b40189dd1a704f99fec91f24b98df64c635dd74b70d3aa8524b58674f1e" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.553889 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565915-577k8" Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.650183 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2"] Mar 19 21:15:04 crc kubenswrapper[4799]: I0319 21:15:04.665853 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-zdhj2"] Mar 19 21:15:05 crc kubenswrapper[4799]: I0319 21:15:05.136625 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67210496-c7b2-4ab5-aae0-23e19f947c67" path="/var/lib/kubelet/pods/67210496-c7b2-4ab5-aae0-23e19f947c67/volumes" Mar 19 21:15:05 crc kubenswrapper[4799]: I0319 21:15:05.564825 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerID="17ce42f55b19e5a443b00a691e5a37bf87b31daea99677cbdf8502d6721ae952" exitCode=0 Mar 19 21:15:05 crc kubenswrapper[4799]: I0319 21:15:05.564880 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" event={"ID":"2ac7839c-b697-426a-b4e6-50559cca8e79","Type":"ContainerDied","Data":"17ce42f55b19e5a443b00a691e5a37bf87b31daea99677cbdf8502d6721ae952"} Mar 19 21:15:05 crc kubenswrapper[4799]: I0319 21:15:05.565297 4799 scope.go:117] "RemoveContainer" containerID="17ce42f55b19e5a443b00a691e5a37bf87b31daea99677cbdf8502d6721ae952" Mar 19 21:15:05 crc 
kubenswrapper[4799]: I0319 21:15:05.859351 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8kxv_must-gather-xcmfm_2ac7839c-b697-426a-b4e6-50559cca8e79/gather/0.log" Mar 19 21:15:14 crc kubenswrapper[4799]: I0319 21:15:14.116923 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:15:14 crc kubenswrapper[4799]: E0319 21:15:14.118138 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:15:16 crc kubenswrapper[4799]: I0319 21:15:16.542237 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8kxv/must-gather-xcmfm"] Mar 19 21:15:16 crc kubenswrapper[4799]: I0319 21:15:16.542914 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="copy" containerID="cri-o://ae87fb236e7291e3573e283542834c8c207af2be87c5357f6260ed83b1831af3" gracePeriod=2 Mar 19 21:15:16 crc kubenswrapper[4799]: I0319 21:15:16.558689 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8kxv/must-gather-xcmfm"] Mar 19 21:15:16 crc kubenswrapper[4799]: I0319 21:15:16.704967 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8kxv_must-gather-xcmfm_2ac7839c-b697-426a-b4e6-50559cca8e79/copy/0.log" Mar 19 21:15:16 crc kubenswrapper[4799]: I0319 21:15:16.706634 4799 generic.go:334] "Generic (PLEG): container finished" podID="2ac7839c-b697-426a-b4e6-50559cca8e79" 
containerID="ae87fb236e7291e3573e283542834c8c207af2be87c5357f6260ed83b1831af3" exitCode=143 Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.010435 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8kxv_must-gather-xcmfm_2ac7839c-b697-426a-b4e6-50559cca8e79/copy/0.log" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.011011 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.206239 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ac7839c-b697-426a-b4e6-50559cca8e79-must-gather-output\") pod \"2ac7839c-b697-426a-b4e6-50559cca8e79\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.206393 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4htt\" (UniqueName: \"kubernetes.io/projected/2ac7839c-b697-426a-b4e6-50559cca8e79-kube-api-access-q4htt\") pod \"2ac7839c-b697-426a-b4e6-50559cca8e79\" (UID: \"2ac7839c-b697-426a-b4e6-50559cca8e79\") " Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.212585 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac7839c-b697-426a-b4e6-50559cca8e79-kube-api-access-q4htt" (OuterVolumeSpecName: "kube-api-access-q4htt") pod "2ac7839c-b697-426a-b4e6-50559cca8e79" (UID: "2ac7839c-b697-426a-b4e6-50559cca8e79"). InnerVolumeSpecName "kube-api-access-q4htt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.307913 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4htt\" (UniqueName: \"kubernetes.io/projected/2ac7839c-b697-426a-b4e6-50559cca8e79-kube-api-access-q4htt\") on node \"crc\" DevicePath \"\"" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.383294 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ac7839c-b697-426a-b4e6-50559cca8e79-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2ac7839c-b697-426a-b4e6-50559cca8e79" (UID: "2ac7839c-b697-426a-b4e6-50559cca8e79"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.410235 4799 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ac7839c-b697-426a-b4e6-50559cca8e79-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.716329 4799 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8kxv_must-gather-xcmfm_2ac7839c-b697-426a-b4e6-50559cca8e79/copy/0.log" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.716834 4799 scope.go:117] "RemoveContainer" containerID="ae87fb236e7291e3573e283542834c8c207af2be87c5357f6260ed83b1831af3" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.716957 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8kxv/must-gather-xcmfm" Mar 19 21:15:17 crc kubenswrapper[4799]: I0319 21:15:17.741530 4799 scope.go:117] "RemoveContainer" containerID="17ce42f55b19e5a443b00a691e5a37bf87b31daea99677cbdf8502d6721ae952" Mar 19 21:15:19 crc kubenswrapper[4799]: I0319 21:15:19.130538 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" path="/var/lib/kubelet/pods/2ac7839c-b697-426a-b4e6-50559cca8e79/volumes" Mar 19 21:15:23 crc kubenswrapper[4799]: I0319 21:15:23.004817 4799 scope.go:117] "RemoveContainer" containerID="bc7a0bc263571ef8c2f0a8a58e18f261ba8239ed3d8ebe1d3ce04936e7b059be" Mar 19 21:15:28 crc kubenswrapper[4799]: I0319 21:15:28.117537 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:15:28 crc kubenswrapper[4799]: E0319 21:15:28.118954 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:15:40 crc kubenswrapper[4799]: I0319 21:15:40.116954 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:15:40 crc kubenswrapper[4799]: E0319 21:15:40.117547 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" 
podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:15:53 crc kubenswrapper[4799]: I0319 21:15:53.131047 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:15:53 crc kubenswrapper[4799]: E0319 21:15:53.132144 4799 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mv84p_openshift-machine-config-operator(cf986000-80c1-4cf1-8648-d2f7ee370e88)\"" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" podUID="cf986000-80c1-4cf1-8648-d2f7ee370e88" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.085095 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rl4wl"] Mar 19 21:15:56 crc kubenswrapper[4799]: E0319 21:15:56.086030 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="copy" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.086045 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="copy" Mar 19 21:15:56 crc kubenswrapper[4799]: E0319 21:15:56.086073 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730c9ff4-bb0f-45a7-90bc-bbca574aca95" containerName="collect-profiles" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.086079 4799 state_mem.go:107] "Deleted CPUSet assignment" podUID="730c9ff4-bb0f-45a7-90bc-bbca574aca95" containerName="collect-profiles" Mar 19 21:15:56 crc kubenswrapper[4799]: E0319 21:15:56.086091 4799 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="gather" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.086097 4799 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="gather" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.086258 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="730c9ff4-bb0f-45a7-90bc-bbca574aca95" containerName="collect-profiles" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.086271 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="gather" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.086287 4799 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac7839c-b697-426a-b4e6-50559cca8e79" containerName="copy" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.087764 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.099224 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl4wl"] Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.249611 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72skf\" (UniqueName: \"kubernetes.io/projected/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-kube-api-access-72skf\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.249701 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-catalog-content\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.249845 4799 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-utilities\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.351652 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-catalog-content\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.351829 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-utilities\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.351982 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72skf\" (UniqueName: \"kubernetes.io/projected/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-kube-api-access-72skf\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.352637 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-utilities\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.352771 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-catalog-content\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.376348 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72skf\" (UniqueName: \"kubernetes.io/projected/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-kube-api-access-72skf\") pod \"redhat-marketplace-rl4wl\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.424963 4799 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:15:56 crc kubenswrapper[4799]: I0319 21:15:56.721850 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl4wl"] Mar 19 21:15:57 crc kubenswrapper[4799]: I0319 21:15:57.161832 4799 generic.go:334] "Generic (PLEG): container finished" podID="f46d35e0-0d66-4ccc-a984-b995d0f1e21f" containerID="87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d" exitCode=0 Mar 19 21:15:57 crc kubenswrapper[4799]: I0319 21:15:57.161878 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerDied","Data":"87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d"} Mar 19 21:15:57 crc kubenswrapper[4799]: I0319 21:15:57.161903 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerStarted","Data":"3a6dd41a1cc06842bf59da696dec97d1f9db807e52fb2e36e39b53ff23078850"} Mar 19 21:15:58 crc kubenswrapper[4799]: I0319 21:15:58.174712 4799 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerStarted","Data":"ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec"} Mar 19 21:15:59 crc kubenswrapper[4799]: I0319 21:15:59.192277 4799 generic.go:334] "Generic (PLEG): container finished" podID="f46d35e0-0d66-4ccc-a984-b995d0f1e21f" containerID="ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec" exitCode=0 Mar 19 21:15:59 crc kubenswrapper[4799]: I0319 21:15:59.192333 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerDied","Data":"ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec"} Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.152025 4799 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565916-9746s"] Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.153942 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.155810 4799 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvv7r\" (UniqueName: \"kubernetes.io/projected/bac61594-d1cd-4f21-a174-a157fb2ee2e0-kube-api-access-mvv7r\") pod \"auto-csr-approver-29565916-9746s\" (UID: \"bac61594-d1cd-4f21-a174-a157fb2ee2e0\") " pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.156432 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.156678 4799 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fgnsj" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.156934 4799 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.162992 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565916-9746s"] Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.206178 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerStarted","Data":"f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b"} Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.231747 4799 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rl4wl" podStartSLOduration=1.831418721 podStartE2EDuration="4.2317253s" podCreationTimestamp="2026-03-19 21:15:56 +0000 UTC" firstStartedPulling="2026-03-19 21:15:57.164049391 +0000 UTC m=+4234.770002483" lastFinishedPulling="2026-03-19 21:15:59.56435598 +0000 UTC 
m=+4237.170309062" observedRunningTime="2026-03-19 21:16:00.225307429 +0000 UTC m=+4237.831260501" watchObservedRunningTime="2026-03-19 21:16:00.2317253 +0000 UTC m=+4237.837678372" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.257335 4799 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvv7r\" (UniqueName: \"kubernetes.io/projected/bac61594-d1cd-4f21-a174-a157fb2ee2e0-kube-api-access-mvv7r\") pod \"auto-csr-approver-29565916-9746s\" (UID: \"bac61594-d1cd-4f21-a174-a157fb2ee2e0\") " pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.282136 4799 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvv7r\" (UniqueName: \"kubernetes.io/projected/bac61594-d1cd-4f21-a174-a157fb2ee2e0-kube-api-access-mvv7r\") pod \"auto-csr-approver-29565916-9746s\" (UID: \"bac61594-d1cd-4f21-a174-a157fb2ee2e0\") " pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.479647 4799 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:00 crc kubenswrapper[4799]: I0319 21:16:00.984213 4799 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565916-9746s"] Mar 19 21:16:02 crc kubenswrapper[4799]: I0319 21:16:02.226749 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565916-9746s" event={"ID":"bac61594-d1cd-4f21-a174-a157fb2ee2e0","Type":"ContainerStarted","Data":"9735c908c702d0893441dfde0cd5b6b21bc1d8453922cb5605b99caad440c8e0"} Mar 19 21:16:04 crc kubenswrapper[4799]: I0319 21:16:04.117882 4799 scope.go:117] "RemoveContainer" containerID="f5c109c01a7453a726a1e9222e89115a469b39989d412e34b586a3383db3abc8" Mar 19 21:16:04 crc kubenswrapper[4799]: I0319 21:16:04.250090 4799 generic.go:334] "Generic (PLEG): container finished" podID="bac61594-d1cd-4f21-a174-a157fb2ee2e0" containerID="50690323a39c939f619bbe9c74cea2c39ed44bb350d5461853db50071a36fc02" exitCode=0 Mar 19 21:16:04 crc kubenswrapper[4799]: I0319 21:16:04.250208 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565916-9746s" event={"ID":"bac61594-d1cd-4f21-a174-a157fb2ee2e0","Type":"ContainerDied","Data":"50690323a39c939f619bbe9c74cea2c39ed44bb350d5461853db50071a36fc02"} Mar 19 21:16:05 crc kubenswrapper[4799]: I0319 21:16:05.265461 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mv84p" event={"ID":"cf986000-80c1-4cf1-8648-d2f7ee370e88","Type":"ContainerStarted","Data":"ddd1f99518b1b9c042ac499c0f873f93bec2d5952c589023522a628d20896d0d"} Mar 19 21:16:05 crc kubenswrapper[4799]: I0319 21:16:05.670703 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:05 crc kubenswrapper[4799]: I0319 21:16:05.697963 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvv7r\" (UniqueName: \"kubernetes.io/projected/bac61594-d1cd-4f21-a174-a157fb2ee2e0-kube-api-access-mvv7r\") pod \"bac61594-d1cd-4f21-a174-a157fb2ee2e0\" (UID: \"bac61594-d1cd-4f21-a174-a157fb2ee2e0\") " Mar 19 21:16:05 crc kubenswrapper[4799]: I0319 21:16:05.707734 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac61594-d1cd-4f21-a174-a157fb2ee2e0-kube-api-access-mvv7r" (OuterVolumeSpecName: "kube-api-access-mvv7r") pod "bac61594-d1cd-4f21-a174-a157fb2ee2e0" (UID: "bac61594-d1cd-4f21-a174-a157fb2ee2e0"). InnerVolumeSpecName "kube-api-access-mvv7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:16:05 crc kubenswrapper[4799]: I0319 21:16:05.800839 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvv7r\" (UniqueName: \"kubernetes.io/projected/bac61594-d1cd-4f21-a174-a157fb2ee2e0-kube-api-access-mvv7r\") on node \"crc\" DevicePath \"\"" Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.278935 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565916-9746s" event={"ID":"bac61594-d1cd-4f21-a174-a157fb2ee2e0","Type":"ContainerDied","Data":"9735c908c702d0893441dfde0cd5b6b21bc1d8453922cb5605b99caad440c8e0"} Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.279309 4799 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9735c908c702d0893441dfde0cd5b6b21bc1d8453922cb5605b99caad440c8e0" Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.279022 4799 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565916-9746s" Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.425778 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.425838 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.491361 4799 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.751190 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565910-57gjx"] Mar 19 21:16:06 crc kubenswrapper[4799]: I0319 21:16:06.763258 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565910-57gjx"] Mar 19 21:16:07 crc kubenswrapper[4799]: I0319 21:16:07.129301 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75cbce3-390b-43f5-9a1f-00fae47ac87a" path="/var/lib/kubelet/pods/e75cbce3-390b-43f5-9a1f-00fae47ac87a/volumes" Mar 19 21:16:07 crc kubenswrapper[4799]: I0319 21:16:07.341400 4799 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:16:07 crc kubenswrapper[4799]: I0319 21:16:07.386540 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl4wl"] Mar 19 21:16:09 crc kubenswrapper[4799]: I0319 21:16:09.307300 4799 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rl4wl" podUID="f46d35e0-0d66-4ccc-a984-b995d0f1e21f" containerName="registry-server" containerID="cri-o://f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b" gracePeriod=2 Mar 19 21:16:09 crc 
kubenswrapper[4799]: I0319 21:16:09.815421 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.002244 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-utilities\") pod \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.002333 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72skf\" (UniqueName: \"kubernetes.io/projected/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-kube-api-access-72skf\") pod \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.002411 4799 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-catalog-content\") pod \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\" (UID: \"f46d35e0-0d66-4ccc-a984-b995d0f1e21f\") " Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.003421 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-utilities" (OuterVolumeSpecName: "utilities") pod "f46d35e0-0d66-4ccc-a984-b995d0f1e21f" (UID: "f46d35e0-0d66-4ccc-a984-b995d0f1e21f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.016598 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-kube-api-access-72skf" (OuterVolumeSpecName: "kube-api-access-72skf") pod "f46d35e0-0d66-4ccc-a984-b995d0f1e21f" (UID: "f46d35e0-0d66-4ccc-a984-b995d0f1e21f"). InnerVolumeSpecName "kube-api-access-72skf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.040187 4799 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f46d35e0-0d66-4ccc-a984-b995d0f1e21f" (UID: "f46d35e0-0d66-4ccc-a984-b995d0f1e21f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.105048 4799 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.105082 4799 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72skf\" (UniqueName: \"kubernetes.io/projected/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-kube-api-access-72skf\") on node \"crc\" DevicePath \"\"" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.105096 4799 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46d35e0-0d66-4ccc-a984-b995d0f1e21f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.324762 4799 generic.go:334] "Generic (PLEG): container finished" podID="f46d35e0-0d66-4ccc-a984-b995d0f1e21f" 
containerID="f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b" exitCode=0 Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.324868 4799 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl4wl" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.324865 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerDied","Data":"f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b"} Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.325339 4799 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl4wl" event={"ID":"f46d35e0-0d66-4ccc-a984-b995d0f1e21f","Type":"ContainerDied","Data":"3a6dd41a1cc06842bf59da696dec97d1f9db807e52fb2e36e39b53ff23078850"} Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.325435 4799 scope.go:117] "RemoveContainer" containerID="f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.367638 4799 scope.go:117] "RemoveContainer" containerID="ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.387364 4799 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl4wl"] Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.395545 4799 scope.go:117] "RemoveContainer" containerID="87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.397800 4799 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl4wl"] Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.451991 4799 scope.go:117] "RemoveContainer" containerID="f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b" Mar 19 
21:16:10 crc kubenswrapper[4799]: E0319 21:16:10.452567 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b\": container with ID starting with f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b not found: ID does not exist" containerID="f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.452619 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b"} err="failed to get container status \"f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b\": rpc error: code = NotFound desc = could not find container \"f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b\": container with ID starting with f0e241493dd93c5f27cd5aa323224510957f1ebe303eff7716a483e30ec9292b not found: ID does not exist" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.452652 4799 scope.go:117] "RemoveContainer" containerID="ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec" Mar 19 21:16:10 crc kubenswrapper[4799]: E0319 21:16:10.453150 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec\": container with ID starting with ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec not found: ID does not exist" containerID="ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.453194 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec"} err="failed to get container status 
\"ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec\": rpc error: code = NotFound desc = could not find container \"ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec\": container with ID starting with ea22d00526d1c4c2d1ddb19e669cb164e52882ff3e24c9bad856431a18d280ec not found: ID does not exist" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.453225 4799 scope.go:117] "RemoveContainer" containerID="87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d" Mar 19 21:16:10 crc kubenswrapper[4799]: E0319 21:16:10.453514 4799 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d\": container with ID starting with 87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d not found: ID does not exist" containerID="87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d" Mar 19 21:16:10 crc kubenswrapper[4799]: I0319 21:16:10.453553 4799 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d"} err="failed to get container status \"87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d\": rpc error: code = NotFound desc = could not find container \"87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d\": container with ID starting with 87372405ff2c56604eed21be7268653ae18aa2a2f660229c1cb6eb872435bc9d not found: ID does not exist" Mar 19 21:16:11 crc kubenswrapper[4799]: I0319 21:16:11.135659 4799 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46d35e0-0d66-4ccc-a984-b995d0f1e21f" path="/var/lib/kubelet/pods/f46d35e0-0d66-4ccc-a984-b995d0f1e21f/volumes" Mar 19 21:16:23 crc kubenswrapper[4799]: I0319 21:16:23.137055 4799 scope.go:117] "RemoveContainer" containerID="1966c12b34a526de3afd55eec54f30ca10d2016bf32ade30143207199e6e38e6" Mar 19 
21:16:23 crc kubenswrapper[4799]: I0319 21:16:23.182886 4799 scope.go:117] "RemoveContainer" containerID="79e64b595c0d7e76ec6508db0176eaaabd9a1256f1ca3fa82e9a975db0c85d18" Mar 19 21:16:23 crc kubenswrapper[4799]: I0319 21:16:23.270231 4799 scope.go:117] "RemoveContainer" containerID="618be630872dfbf30231362d3d16b6dc89594f38c3bb4104e058e28e59acd3f9"